In a lot of crazy train frameworks, the existence of people is net negative, so a large future for humanity is the worst thing that could happen.
Curious to know why you think these frameworks are crazier than the frameworks that say it’s net positive.
Or are you saying it’s too crazy in both cases, and that we should reduce extinction risks (or at least not increase them) for non-longtermist reasons?