On (2), while it’s obviously rhetorically slanted, isn’t that a fair framing of longtermism? They do care more about the gazillions of future lives than the smaller number of present ones, and they seem to understand that this is not aligned with popular intuitions on the subject.
I am not sure longtermism is compatible with good PR, or with ordinary people immediately grokking its conclusions on intuition alone…
(Or was your problem more that this misrepresents the actual funding allocations, in which case I wholeheartedly agree 💙)
Maybe I’m naive, but I think “we care more about ‘asteroids and killer robots’ than about the 700M people currently in poverty” is a terrible framing of longtermism. Maybe it’s technically correct on one level, but all compassion and nuance is lost.
Longtermists care deeply about the 700 million people in poverty, and many of them have already given much of their lives and money towards ending that poverty. They also see the huge future potential of humanity and see value in building and safeguarding a better future for their children and their children’s children. They’ve decided to devote their lives to that better future, while many other effective altruists continue to raise funds and forge solutions to alleviate current global poverty.
Longtermism is compatible with good PR; I think we just often do it badly.
Overall, this might be another tragic loss of a high-profile smart guy who used to be at least a soft EA supporter.
I think I agree with the general point you’re making, but I specifically disagree that the longtermist project is incompatible with good PR, and that it doesn’t appeal to common moral intuition (eg people do care about climate change, nuclear war, rogue AI, deadly pandemics).
“eg people do care about climate change, nuclear war, rogue AI, deadly pandemics”
But those things are also important without longtermism. So you can make non-longtermist PR for things that longtermists like, but it would feel dishonest to hide the part about the large numbers of people millions of years into the future.