Would you annihilate all life on Earth, if somehow you could? Are you just anti-Mars or also anti-Earth? Your argument seems to be that life itself is bad, so I’m trying to see how far that goes and where it takes you.
I’m a negative-leaning utilitarian but not a negative utilitarian—I think happiness matters and that a utopia is at least marginally better than the absence of life. But I also recognize there are many outcomes worse than the absence of life, and that we are in such a state right now. Despite our best efforts, which we should continue to deploy, I expect suffering will continue to rise as humans colonize other planets and torture more animals and, eventually, digital minds. I’ll let you determine where that might lead philosophically if one could press a button, but I’m more concerned with what to do about it in practice, in reality. My vote is that the EA community focus on making humanity less immoral, slow space colonization, focus much less on x-risks and more on s-risks, stop fueling utopians, etc. Hope that clarifies!
I mean, the good news (from your point of view) is that Mars colonization is going to happen pretty dang slowly. Even establishing a tiny base like the ISS or the moonbase in For All Mankind is probably going to take decades. (Elon’s timelines are always wildly optimistic, and always getting pushed back…)
The only things I can see that would make Mars colonization go fast would be things that have a disruptive or transformative impact on Earth, such as superhuman AGI.
Don’t feel any pressure to reply, but if you feel like it, I’m curious to know what kind of utopians or utopianism you think is dangerous.
I agree that’s good news!
It’s hard for me to make sense of whether AGI will be good or bad. I like the idea of it accelerating cellular agriculture; I hate the idea of it fueling space colonization. I could make a long list going back and forth.
Here’s an example. I don’t think this tone is helpful (though well intentioned and beautifully written): https://whatweowethefuture.com/afterwards/
I skimmed the sci-fi short story. What do you think is unhelpful?
The optimistic tone and utopian scene fuel the idea that space colonization and the expansion of humanity are a good idea.