No need to steelman—there are good arguments against this and it’s highly nonobvious what % of EA effort should be on longtermism, even from the perspective of longtermism. Some arguments:
If longtermism is wrong (see another answer for more on this)
If getting a lot of short-run wins is important for having long-run influence
If longtermism is just too many inferential steps away from existing common sense, such that more people would ultimately get into longtermism if there were more focus on short-term wins
If now isn’t the right time for longtermism (because there isn’t enough to do) and instead it would be better if there were a push around longtermism at some time in the future
I think all these considerations are significant, and are part of why I’m in favour of EA having a diversity of causes and worldviews. (Though not necessarily with the ‘three cause area’ breakdown we currently have, which I think is a bit narrow.)
> If now isn’t the right time for longtermism (because there isn’t enough to do) and instead it would be better if there were a push around longtermism at some time in the future
Have you thought about whether there’s a way you could write your book on longtermism to make it robustly beneficial even if it turns out that it’s not yet a good time for a push around longtermism?