People in general, and not just longtermist altruists, have reason to be concerned with extinction. It may turn out not to be a problem, or not to be solvable, so the marginal impact of working on it seems questionable here. In contrast, few people are thinking about how to navigate our way to a worthwhile future. There are many places where thoughtful people might influence decisions that effectively lock us into a trajectory.
few people are thinking about how to navigate our way to a worthwhile future.
This might be true on the kinds of scales EAs are thinking about (potentially enormous value, long time horizons), but isn't it the case that many people want to steer humanity in a better direction? E.g. the Left, environmentalists, libertarians, … ~all political movements?
I worry EAs think of this as some unique and obscure thing to think about, when it isn’t.
(on the other hand, people neglect small probabilities of disastrous outcomes)
Lots of people think about how to improve the future in very traditional ways. Assuming the world keeps operating under the laws it has been for the past 50 years, how do we steer it in a better direction?
I suppose I was thinking of this in terms of taking radical changes from technology development seriously, though not in the sense of long timelines or weird sources of value. Far fewer people are thinking about how to navigate a time when AGI has become commonplace than are thinking about how to get to that point, even though there might not be a large window of time between the two.