When I came to university I had already read a lot of the Sequences …
You’d read the Sequences but you thought we were a cult? Inconceivable!
(/sarcasm)
Oddly, while I agree with much of this post (and strong upvoted), it reads to me as evidencing many of the problems it describes! Almost all of the elements that make EA seem culty seem to me to hail from the rationality side of the movement: Pascalian reasoning, in-group jargon, hero worship, or rather epistemic deferral to heroes and to holy texts, and eschatology (tithes being the one counterexample I can think of), all of which I see in the OP.
I don’t know what conclusion one is supposed to draw from this, but it disposes me both toward agreeing with your critique and toward greater scepticism that following your recommendations would do much to fix the problem.
I also don’t have any great answers, but I do strongly feel that one can be an extremely valuable EA without having heard of the Sequences. I understand the efficiency of jargon, but I think in 90% of EA conversations where I hear it used, communicating more literally would have outweighed the efficiency loss—and that’s without considering the improved social signal of eschewing jargon.
As a side note, I wondered while reading this whether targeting people at university with such high priority could in itself be a core part of the problem, regardless of how it’s done. What other types of social group have relatively old leaders but primarily target under-20s? I think the answer is ‘plenty of fairly harmless ones’, such as sports enthusiasts, but if that were all I knew about a group it would certainly increase my prior that they were sinister. So maybe another approach worth looking into more is greater recruitment efforts towards older people. I doubt it’s as cost-effective unless you think GAI is at most a decade or two away, but it might send an important signal that we’re not just about preying on teenagers.
Almost all of the elements that make EA seem culty seem to me to hail from the rationality side of the movement: Pascalian reasoning, in-group jargon, hero worship, or rather epistemic deferral to heroes and to holy texts, and eschatology
The hero worship is, I think, especially concerning, and is a striking way in which implicit/“revealed” norms contradict explicit epistemic norms for some EAs.