While existential risks are widely acknowledged as an important cause area, some EAs, such as William MacAskill, have argued that the long-term trajectory of civilization may remain highly contingent even if x-risk is solved, and so “Trajectory Change” may be just as important for the long-term future. I would like to see this debated as a cause area.