I think there is a very clear split, but it’s not over whether people want to do the most good. I would say the real split is between “empiricists” and “rationalists”, and it’s about how much certainty we should demand before we devote our time and money to a cause.
The thing that made me supportive of EA was the rigorous research that went into its cause areas. We have peer-reviewed studies demonstrating that malaria nets save lives; there is real, tangible empirical evidence that a donation to a GiveWell-recommended charity does real good. There is still plenty of uncertainty in these cause areas, but it is relatively bounded by the available data.
Longtermism, on the other hand, is built on inherently shakier ground, because it speculates about unbounded problems whose estimates can vary wildly depending on one’s personal biases. Rationalists think you can overcome this by reasoning very hard about the problems and extrapolating from current experience into the far future, or onto things that don’t exist yet, like AGI.
You can probably tell that I’m an empiricist, and I find that the so-called “rationalists” have laid their foundations on a pile of shaky, questionable assumptions I don’t accept. That doesn’t mean I don’t care about the long term; climate change risk, for example, is very well studied.