Here is my attempt at summarizing the main points:
In What We Owe the Future, MacAskill agrees with other longtermists about the moral importance of the long-term future, but disagrees with most of them about how best to affect it. Relative to other longtermists, MacAskill places more weight on shaping societal values and less on preventing AI-triggered extinction. Also, MacAskill's recommendations for how to influence the long-term future seem to have been researched less thoroughly than the rest of the book.