[Question] Are there cause prioritization estimates for s-risk supporters?

80,000 Hours has lots of concrete guides for each cause area, and there's even an estimate of the importance of each problem (though they claim it's not very accurate), as below. But they estimate from an x-risk angle. For example, the "Scale" number is determined by the number of people we save (DALYs), so AI safety got the highest score because its extinction risk is the highest. In s-risks, however, the main consideration should be "the suffering we reduce". There are also lots of areas within s-risks. Is there any cause prioritization research on s-risks?
