Hello Paul. I think I'm missing something crucial in your argument. You are saying that investing in technological progress is worse than… what? I understand that there are black swan technologies that create X-risk, and that we therefore want to invest in safety research rather than in speeding up the technology itself. Are you saying anything beyond that? In a universe without risk, would you still be against prioritizing progress?
Why are the long-term asymptotics crucial to the question? Yes, by accelerating progress you are "merely" translating everything forward in time. So what? The result of skipping from 1800 to 1900 is an immense utility gain, whatever the long-term scenario turns out to be.
I think the point is this: improving general (intellectual) progress is significantly less effective than improving differential (i.e., specific) intellectual progress. Am I right?