Hello Paul. I think I'm missing something crucial in your argument. You are saying that investing in technological progress is worse than… what? I understand that there are black swan technologies that create X-risk, and that we therefore want to invest in safety research rather than in speeding up the technology itself. Are you saying anything besides that? In a universe without risk, would you still be against prioritizing progress?
Why are the long-term asymptotics crucial to the question? Yes, by accelerating progress you are “merely” translating everything in time. So what? Skipping from 1800 to 1900 would yield an immense utility gain, whatever the long-term scenario turns out to be.
How about “pragmatic altruist”? It conveys the idea that we want to do good but approach it rationally rather than emotionally. It also conveys the idea that we want people to do things that are reasonable (like donating 10% of their income) rather than things that most people would consider unreasonable (like donating everything beyond a minimal subsistence income).