Expected value seems very important. It underlies a lot of other important concepts, is relevant to both neartermism and longtermism, and is extremely frequently brought up in EA discussions and arguments.
I’ve come, via the joking-to-serious pipeline, to telling people that EAs are just people who are really excited about multiplication, and who think multiplication is epistemically and morally sound.
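For concreteness, the “multiplication” here is just the definition of expected value: weight each possible outcome by its probability and sum. A minimal worked example, with illustrative numbers of my own rather than anything from a real cost-effectiveness estimate:

```latex
\[
  \mathbb{E}[V] \;=\; \sum_i p_i \, v_i
\]
% Toy example (numbers invented for illustration): an intervention with a
% 10% chance of saving 100 lives and a 90% chance of saving none has
\[
  \mathbb{E}[V] = 0.1 \times 100 \;+\; 0.9 \times 0 = 10 \text{ lives saved in expectation.}
\]
```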
I think this is right, and its prevalence may be the single most important difference between EA and the rest of the world.