Purely from online observation (and I’ll admit I’m biased towards the EA side of things here):
- EA is less reliant on “the Sequences” and other pop-science blog posts when supporting its claims.
- EA tends to be more deferential to expert opinion and academic research when evaluating claims.
- Rationalists tend to have higher estimates of AI doom chances.
- EA tends to be more sympathetic to social justice issues.