Purely from online observation (and I’ll admit I’m biased towards the EA side of things here):
EA is less reliant on "the sequences" and other pop-science blog posts to support its claims.
EA tends to be more deferential to expert opinion and academic research when evaluating claims.
Rationalists tend to have higher estimates of the probability of AI doom.
EA tends to be more sympathetic to social justice issues.