Humans evolved strong punishment norms for a reason.
While I normally value EA members’ willingness to break social norms, I’m finding myself wishing for the chilling effect of taboos and punishments right now. The forum has done incredible damage to itself over the last few days.
I am not proud of human history. Why do you want unkindness so much in this case?
Because I’m a consequentialist, and it seems like I need the aforementioned norms to get good consequences (EA not doing irreparable damage to its reputation) in this case.
Social organizations like EA face their own form of natural selection. EA competes for members and funding with other good-doing and intellectual communities, in an environment where prospective members and funders almost universally believe that saying “different races have different average IQs” is irredeemably racist. A large portion of EAs rallying in support of someone who said that is therefore a surefire way for EA to lose members and funds.
It would be adaptive* for EA to have norms in favor of downvoting strongly taboo-breaking comments that have little to no utility-creating potential.
*butchering the definition of that word a bit
in an environment where prospective members and funders almost universally believe that saying “different races have different average IQs” is irredeemably racist

Is that true? I am skeptical. Notably, this seems to be controversial among the current membership (which is exactly what OP is complaining about!)
I am strongly confident that this is true. My prior is something like 99%. I can’t think of a single person I’ve met in real life (and I’ve been offline involved with political organizations, nonprofits, and a wide variety of intellectual communities) who wouldn’t consider “different races have different average IQs” to be prima facie racist. The number goes up only slightly for people I’ve encountered online (and more than half of them were encountered over the past few days).
Edit:
I think this demonstrates just how disconnected some EAs are from mainstream social norms (I don’t mean that as an insult; being disconnected from mainstream norms has its benefits, though I think it’s bad in this specific case). Claiming a difference in intelligence between races is one of the worst things you can say in polite society in 2023. It’s pretty much on par with rape apologia or advocating for pedophilia. It’s worse than, say, advocating in support of torture.
99% is really too high. It’s more than 1% likely that you’re just in a very strong ideological filter bubble (such bubbles are surprisingly common; for example, I know very few Republican voters even though they’re roughly half the US). The fact that this is a strong social norm makes a bubble more likely.
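To put rough numbers on that worry (the figures here are purely illustrative assumptions, not estimates): let $B$ be “my social sample is a filter bubble.” Then

$$
P(\text{claim}) = P(\text{claim} \mid \neg B)\,P(\neg B) + P(\text{claim} \mid B)\,P(B).
$$

Even granting $P(\text{claim} \mid \neg B) = 1$, if $P(B) = 0.05$ and your credence reverts to a 0.5 base rate conditional on being in a bubble, the most you can justify is $0.95 \times 1 + 0.05 \times 0.5 = 0.975$, which is already below 99%.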
I already said this, but I don’t really understand how you can be so confident in this given the current controversy. It seems pretty clear that a sizeable fraction of current members don’t agree that saying “different races have different average IQs” is irredeemably racist. Doesn’t that disprove your claim? (Current members are at least somewhat representative of prospective members.)
I think a historical strength of EA has been its ability to draw from people disconnected from mainstream social norms, especially because there is less competition for such people.
I might be wrong! But I stand by it. I don’t believe myself to be in an ideological bubble. I grew up in the south, went to college in a highly rural area, and have friends across the political spectrum. Most of my friends from college are actually Republican; a few are even Trump supporters (honestly, I think they have some racial bias, but if you asked them “is saying white people have higher IQs than black people racist?” I’m highly confident they would say yes).
The current controversy is pretty easily explainable to me without updating my priors: the EA community has attracted a lot of high decoupler rationalists who don’t much care about mainstream norms (which again, is a virtue in many cases—but not this one).
Yeah, that explanation seems right. But—the high-decoupler rationalists are the counterexample to your claim! That group is valuable to EA, and EA should make sure it remains appealing to them (including the ones not currently in EA—the world will continue to produce high-decoupler rationalists). Which is not really consistent with the strong norm enforcement you’re advocating.
I think this is a decent argument, but I probably disagree. I think most high decouplers aren’t utilitarian or utilitarian-adjacent, and aren’t inclined to optimize for social welfare the way I think EA should. I have another comment arguing somewhat provocatively that rationalist transplants may harm the EA movement more than help it, being motivated by norm-violative truth seeking over social welfare.
But as I say in the other post, I wouldn’t point out any individual rationalists/high-decouplers as bad for the movement; my argument is just about group averages ;)
FWIW, I’m highly skeptical of longtermism for epistemic reasons, so I value the influx of people who care a lot about AGI alignment and whatnot much less than most people on here do.
High-decoupling truth seekers who wanted to do the most good were necessary for founding the movement. Now that it exists, they aren’t. As time goes on, it will more closely approach the normal charitable movement, where fashion and status seeking are as important as, if not more important than, doing the most good. This was inevitable from the founding of the movement. Autists and true believers eventually lose to people who are primarily interested in power and social status in every social movement. Getting ten or twenty years of extra good, relative to the normal charitable complex, before the movement degenerates into it is still good.

https://meaningness.com/geeks-mops-sociopaths
Decoupling by definition ignores context. Context frequently has implications for social welfare. Utilitarian goals therefore cannot be served without contextualizing.
I also dispute the idea that the movement’s founders were high decoupler rationalists to the degree that we’re talking about here. While people like Singer and MacAskill aren’t afraid to break from norms when useful, and both (particularly Singer) have said some things I’ve winced at, I can’t imagine either saying anything remotely like Bostrom’s statement, nor thinking that defending it would be a good idea.
I can’t imagine Singer having a commitment to public truth telling like Bostrom’s either, because he’s published on the moral necessity of the Noble Lie[1]. If he believed something that he thought publicising would be net negative for utility, he would definitely lie about it. I’m less sure that MacAskill would lie if he thought it expedient for his goals, but the idea that he’s not a high decoupler beggars belief. He’s weird enough to have come up with a highly fit new idea that attracts a bunch of attention, and even real action, from intellectually inclined philosophy lovers. Normal people don’t do that. Normal people don’t try to apply their moral principles universally.
[1] https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1467-9329.2009.00449.x
https://betonit.substack.com/p/singer-and-the-noble-lie
Lying to meet goals != contextualizing
It’s hard for me to follow what you’re trying to communicate. Are you saying that high contextualizers don’t/can’t apply their morals universally while high decouplers can? I don’t see any reason to believe that. Are you saying that decouplers are more honest? I also don’t see any reason to believe that.
I’m saying decouplers are more likely to be honest with themselves, and, if they value truth terminally, to be honest with others. Having multiple goals that trade off against each other in ways that are unclear to the people who hold them, never mind anyone else, generally goes along with that kind of muddled thinking. You don’t get to be any good as an academic philosopher without being a high decoupler. It’s a necessity for being an expert, never mind world class.
Peter Singer seems pretty clearly to be a utilitarian above and beyond any other moral considerations, so if he thought lying would increase utility, he’d do it. Nick Bostrom clearly values truth seeking over short-term considerations. If you don’t value the longtermist element of EA, that’s fine. Have fewer people die, have more people live long, healthy, happy lives, perhaps care about animal welfare on some level. That’s plenty for the next 30 years, and truth as a terminal value is not really relevant to it. Being right is, however, very important if you’re trying to get things right on the scale of 100 years, never mind anything longer than that. Not lying, even when it’s convenient, is important. The purge-Bostrom faction almost certainly mostly believes that what they profess is true. But purging people for having different beliefs is not truth-seeking behaviour. If you’re a longtermist, you can’t do that. You can’t lie for political convenience either, unless you plan to do everything important in secret, because every area where your model and the world differ is an opportunity to fuck up, and you’re very likely to do so anyway.
If all “high context” means is that Bostrom could have said the same thing but less bluntly, fine. It seems extremely unlikely, though, that those calling for his ouster would have found a statement that delivered the same truth claim more delicately any less offensive.