I’m curious[1] if people observe any differences of opinion that seem to come up a bit too often to be random. I think I’ve noticed at least three—my impression is that, compared to EAs, rationalists on the whole:
Care less about the welfare of nonhumans
Have more faith in unvarnished honesty as a mechanism for collective truth-seeking
Are more open-minded about weird or controversial ideas
I’d love to hear if anyone disagrees or has other observations.[2]
Philosophically, I think rationalists, relative to EAs:
Tend to be more focused on complexity of value; hedonic utilitarianism is rarer among them
This is reflected both in differing visions of the future and in being less drawn to animal welfare.
Tend to have a more continuous + computational view of personal identity
This is reflected in, e.g., interest in anti-aging and cryonics
Are less likely to be causal decision theorists and/or to think on the margin
Sociologically, I think they:
Tend to be more neurodiverse (particularly on the autism spectrum)
Also have a greater incidence of mental health issues
Have greater class diversity (especially when you look at the current ~20-year-olds in EA, who, at least in the US, seem to come almost uniformly from elite colleges)
Are more male
Are more white
Rely more on first-principles thinking and verbal logic, less on empirical data and fast BOTECs (back-of-the-envelope calculations)
Are less deferential
Are less prestige-conscious
Are more intelligence-conscious
These are excellent, cheers!
Although I would have said “less prestige-seeking; more prestige-conscious” (as in, they talk about it a lot, but tend to be quite scrupulous about not seeking it for themselves and are sceptical of its usefulness).
The rationality community focuses far less on morality, and has members who are amoral or completely selfish. I’ll go out on a limb and claim that rationalists also have a broader set of interests, since there is less restriction on where attention can be focused (EA wants to do good, while the rationality community is interested in truth, and truth can be sought about basically anything).
Purely from online observation (and I’ll admit I’m biased towards the EA side of things here):
EAs are less reliant on “the Sequences” and other pop-science blog posts when supporting their claims.
EA tends to be more deferential to expert opinion and academic research when evaluating claims.
Rationalists tend to have higher estimates of AI doom chances.
EA tends to be more sympathetic to social justice issues.
Any rationalists around who could answer this question? Crosspost to LessWrong?
I consider myself more culturally rationalist than EA; my (short) answer is above. The real answer is 10k words and probably not worth the effort per unit of insight/importance.
Thanks, good to know. Love the rationalist answer :)