Yeah, that explanation seems right. But the high-decoupler rationalists are a counterexample to your claim! That group is valuable to EA, and EA should make sure it remains appealing to them (including the ones not currently in EA; the world will continue to produce high-decoupler rationalists). That is not really consistent with the strong norm enforcement you’re advocating.
I think this is a decent argument, but I probably disagree. I think most high decouplers aren’t utilitarian or utilitarian-adjacent, and aren’t inclined to optimize for social welfare the way I think it is important for EA to. I have another comment arguing somewhat provocatively that rationalist transplants may harm the EA movement more than helping it by being motivated by norm-violative truth seeking over social welfare.
But as I say in the other post, I wouldn’t point out any individual rationalists/high-decouplers as bad for the movement; my argument is just about group averages ;)
FWIW, I’m highly skeptical of longtermism for epistemic reasons, so I value the influx of people who care a lot about AGI alignment and whatnot much less than most people on here.
High-decoupling truth seekers who wanted to do the most good were necessary for founding the movement. Now that it exists, they aren’t. As time goes on, it will more closely approach the normal charitable movement, where fashion and status seeking are as important as doing the most good, if not more so. This was inevitable from the founding of the movement. In every social movement, autists and true believers eventually lose to people who are primarily interested in power and social status. Getting ten or twenty years of extra good over the normal charitable complex before the movement degenerates into it is still a win. https://meaningness.com/geeks-mops-sociopaths
Decoupling by definition ignores context. Context frequently has implications for social welfare. Utilitarian goals therefore cannot be served without contextualizing.
I also dispute the idea that the movement’s founders were high decoupler rationalists to the degree that we’re talking about here. While people like Singer and MacAskill aren’t afraid to break from norms when useful, and both (particularly Singer) have said some things I’ve winced at, I can’t imagine either saying anything remotely like Bostrom’s statement, nor thinking that defending it would be a good idea.
I can’t imagine Singer having a commitment to public truth-telling like Bostrom’s either, because he’s published on the moral necessity of the Noble Lie[1]. If he believed something he thought publicising would be net negative for utility, he would definitely lie about it. I’m less sure that MacAskill would lie if he thought it expedient for his goals, but the idea that he’s not a high decoupler beggars belief. He’s weird enough to have come up with a highly fit new idea that attracts a bunch of attention, and even real action, from intellectually inclined philosophy lovers. Normal people don’t do that. Normal people don’t try to apply their moral principles universally.
[1] https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1467-9329.2009.00449.x
https://betonit.substack.com/p/singer-and-the-noble-lie
Lying to meet goals != contextualizing
It’s hard for me to follow what you’re trying to communicate. Are you saying that high contextualizers don’t/can’t apply their morals universally while high decouplers can? I don’t see any reason to believe that. Are you saying that decouplers are more honest? I also don’t see any reason to believe that.
I’m saying decouplers are more likely to be honest with themselves, and, if they value truth terminally, to be honest with others. Having multiple goals that trade off against each other in ways that are unclear even to the people who hold them, never mind anyone else, generally goes along with muddled thinking. You don’t get to be any good as an academic philosopher without being a high decoupler. It’s a necessity for being an expert, never mind world class.
Peter Singer seems pretty clearly to be a utilitarian above and beyond any other moral considerations, so if he thought lying would increase utility, he’d do it. Nick Bostrom clearly values truth seeking over short-term considerations. If you don’t value the longtermist element of EA, that’s fine. Have fewer people die, have more people live long, healthy, happy lives, perhaps care about animal welfare on some level. That’s plenty for the next 30 years, and truth as a terminal value is not really relevant to it. Being right is, however, very important if you’re trying to get things right on the scale of 100 years, never mind anything longer than that. Not lying, even when it’s convenient, is important. The purge-Bostrom faction almost certainly mostly believes that what they profess is true. But purging people for having different beliefs is not truth-seeking behaviour. If you’re a longtermist, you can’t do that. You can’t lie for political convenience either, unless you plan to do everything important in secret, because every area where your model and the world differ is an opportunity to fuck up, and you’re very likely to do so anyway.
If all “high context” means is that Bostrom could have said the same thing but less bluntly, fine. But it seems extremely unlikely that those calling for his ouster would have found a more delicately worded statement with the same message any less offensive.