I’m sceptical about “ranking” something as complex and personal as career paths in any way, but I can understand why 80,000 Hours is doing it. However, isn’t it also possible to always be outright and unambiguously appreciative of the tremendous amount of good these “other” areas achieve for people, animals, and the planet?
I’m pretty bullish on ranking things generally, as long as you make your assumptions clear. If you think that people being alive is better than them being dead, and if you think these risks are tractable, it seems pretty trivial that existential risks are a really high priority.
I like that people are working on current problems and I’d be sad if they didn’t, but on balance, if the risks are as serious as I think they are, then they are the biggest problems.
I see what you are saying. However, I’m not disputing 80k’s view that x-risks are the biggest and most important issue (even though I might personally disagree). I’m simply wary about the way they present other cause areas in comparison to that. Because while you might consider them less important, I think you’d agree they are still pretty important in their own right? Plus, like I’ve mentioned in other posts: the many, many people working in so-called “less important” areas could still be multipliers who make other people aware of EA and 80k, and those people might then start working on x-risks etc.
I agree. And half of me thinks it’s a fair take, and half of me says “tough”. The world is full of problems and we are trying to triage them. Disagree with the order if you like, but if you agree with the order and yet think “this woman is dying, so she’s higher priority than this man” is the wrong frame, I think you misunderstand what we are doing here.
Like I said, I accept the fact that they are ranking, and the order they’ve come up with according to their values and estimations. I agree with triage—why would I be involved in EA if I didn’t? It’s not the “what” they communicate I have an issue with, it’s the “how”. I’ll admit that pushing x-risk as the number one priority whilst also making other causes sound important is very difficult—of course it is. But that doesn’t mean it’s not worth talking or thinking about (and I’m sure 80,000 Hours is thinking loads about this).
… I think you misunderstand what we are doing here.
I’ll be honest: I think that’s quite a condescending and rebuffing thing to say (even when phrased as a conditional). You don’t know what I do or don’t understand, and I’m not quite sure who this “we” you’re talking about is supposed to be.