I disagree. It seems to me that the EA community’s strength, goodness, and power lie almost entirely in our ability to reason well (so as to actually be “effective”, rather than merely tribal/random). It lies in our ability to trust in the integrity of one another’s speech and reasoning, and to talk together to figure out what’s true.
Finding the real leverage points in the world is probably worth orders of magnitude in our impact. Our ability to think honestly and speak accurately and openly with each other seems to me to be a key part of how we access those “orders of magnitude of impact.”
In contrast, our ability to have more money/followers/etc. (via not ending up on the wrong side of a cultural revolution, etc.) seems to me to be worth… something, in expectation, but not as much as our ability to think and speak together is worth.
(There’s a lot to work out here, in terms of trying to either do the estimates in EV terms, or trying to work out the decision theory / virtue ethics of the matter. I would love to try to discuss in detail, back and forth, and see if we can work this out. I do not think this should be super obvious in either direction from the get go, although at this point my opinion is pretty strongly in the direction I am naming. Please do discuss if you’re up for it.)
First of all, thanks so much for taking the time to provide an insightful (and poetic!) comment.
It seems to me that the EA community’s strength, goodness, and power lie almost entirely in our ability to reason well
Mostly agreed. I think “reasoning well” hides a lot of details, though; e.g., a lot of the time people reason poorly because of specific incentives rather than because of a general inability to reason.
Finding the real leverage points in the world is probably worth orders of magnitude in our impact.
Agreed.
Our ability to think honestly and speak accurately and openly with each other seems to me to be a key part of how we access those “orders of magnitude of impact.”
Agreed, but I think the more relevant question is whether the expected harm of being up against the wall in a cultural revolution is likely to hinder our accuracy more than the expected accuracy loss of some selective self-censorship, particularly in public.
I do find The Weapon of Openness moderately persuasive as a counterargument, as well as the empirical results of the pro- and anti-censorship questions raised around COVID.
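To make the shape of the comparison concrete, here is a toy expected-value sketch. Every number in it is made up purely for illustration (the probability, the harm fractions, all of it); none are estimates I would defend, and the point is only that the conclusion swings entirely on which made-up numbers you pick:

```python
# Toy expected-loss comparison between two policies.
# ALL numbers below are hypothetical placeholders, not real estimates.

# Option A: full public openness.
p_revolution = 0.02      # hypothetical: chance of a severe cultural revolution
harm_if_targeted = 0.9   # hypothetical: fraction of the movement's impact lost if targeted
ev_loss_openness = p_revolution * harm_if_targeted

# Option B: selective public self-censorship.
# hypothetical: ongoing impact lost to degraded collective epistemics
ev_loss_censorship = 0.05

better = "openness" if ev_loss_openness < ev_loss_censorship else "self-censorship"
print(f"lower expected loss under these assumptions: {better}")
```

With these particular placeholder numbers openness wins, but nudging `p_revolution` to 0.1 flips the answer, which is why I think the disagreement is really about the inputs rather than the framework.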
In contrast, our ability to have more money/followers/etc. (via not ending up on the wrong side of a cultural revolution, etc.) seems to me to be worth… something, in expectation
I think you’re being really cavalier about being on the wrong side of the cultural revolution. Maybe the revolution will be light, or it won’t happen at all, but if a cultural revolution half as big as China’s happens and we’re seen as being on the wrong side of it, I think the movement de facto cannot exist in the Anglophone world.
I also think you may be modeling this as me proposing that we as a community strongly side with the winning side and try to acquire power and influence that way, which I emphatically am not. Instead, I’m mostly proposing that most of us treat the possibility of a cultural revolution like the weather, and not fight hurricanes until we understand geoengineering much better.
I’m leaving open the possibility that a small number of us should try to be on either side of this, whether in a “vote your conscience” way or because they individually think they want resources or whatever, but by default I think our movement is best protected by not trying to acquire lots of political power or to fight revolutions.
I would love to try to discuss in detail, back and forth, and see if we can work this out.
I will try my best to talk about this more, but I can’t promise I’ll respond. I’m both pretty busy with work and (this is closer to my true rejection) find talking about these concepts kinda emotionally exhausting.
Hi, this is meant to be a reply to your reply to Anna. Please post it for me. [...]
Agreed that Anna seems to be misinterpreting you or not addressing your main point. The biggest question in my mind is whether EA will be on the wrong side of the revolution anyway, because we’re an ideological competitor and a bundle of resources that can be expropriated. Even if that’s the case though, maybe we still have to play the odds and just hope to fly under the radar somehow.
Seems like hiring some history professors as consultants might be a good use of money for EA orgs at this point. It would be really helpful to have answers to questions like: Did any society ever manage to stop a cultural revolution after it has progressed to a stage analogous to the current one, and if so how (aside from letting it exhaust itself)? From historical precedent can we predict whether EA will be targeted? Were there relatively small groups that managed to survive these revolutions with their people/culture/property/relationships intact and if so how?
FWIW, I don’t think a cultural revolution is very likely, just likely enough (>1%) that we shouldn’t only think about object-level considerations when deciding whether to sign a petition or speak out publicly in support of someone.
I also suspect history professors will not be able to answer this honestly and dispassionately in worlds where a cultural revolution is likely.