Like I’ve said in many other comments, I don’t have a problem with their ranking or with the fact that there is a ranking in the first place. And of course they are explicit about their values. But I still think there are ways to push x-risk as the top priority whilst also conveying other cause areas as more valuable than 80k currently does. Difficult of course, but not impossible. The key problem is that I’m not sure many people discouraged from “less important causes” then happily go into longtermism. I think it’s more likely they stop being active altogether (this is my personal impression, of course, from my own experiences and many conversations). Because you can’t force yourself to care about something when you simply don’t—even if you want to and even if that would be the “best” and most rational thing to do. So people in “less important causes” might be lost altogether and stop doing their “less important” but still pretty valuable (I think) work. And that is the concern I wanted to voice. Not all that “absurd”, I think.
Dear Benjamin, thank you so much for taking the time to write this thorough response. That’s certainly more than I ever expected. I hope you don’t feel like I meant to attack you personally by picking out copy you wrote—that was certainly not my intention and merely a coincidence.
I can only imagine how difficult it is for 80k to navigate all the different stakeholders and their opinions. And like I’ve said in many comments, I definitely think 80,000 Hours should pursue what they deem most important and right.
However, I still wanted to raise this question, as I could really feel myself getting demotivated—it didn’t happen abruptly, but gradually, with every piece of messaging I perceived to be devaluing the values I hold and the work I do. Of course I got biased over time. But then again, I know people who feel the same or similarly, and apparently some people here on the forum do as well.
I think the key issue might be that 80k ranks cause areas in a “rational” way in terms of their possible impact and neglectedness—but as a human, I think it’s natural to perceive this rather as a ranking of values (which in some sense it is), and of course having your personal values ranked “at the bottom” doesn’t exactly feel nice… Especially since I guess for many people, the decision to work in a certain cause area is based mostly on personal interests and less on objective considerations. There are many exceptions, surely, but I think for many people “choosing” animal welfare over longtermism isn’t so much an active choice as a subconscious inclination that’s already set long before you ever start to think about what you want to do. And when you then read that the thing you “chose” based on your intrinsic motivations isn’t “all that important”… well, that’s where the demoralisation kicks in. 80k never puts it that drastically, of course, quite the opposite—but we’re talking about deep-seated values here, the very core of what we are. So it’s probably natural to be quite defensive of them.
So for me, the whole thing is partly about how these messages might influence future decisions on career choice, but also strongly about how they make people feel about the choices they’ve already made and the values they currently hold—which they probably often don’t have all that much control over, like I said. It’s quite frustrating to think “Well, I’d really like to care deeply about all this, but I just don’t and there’s nothing I can do to change that, since I’m not a 100% rational being”.
It’s certainly not my place to give you advice on how to do your job, and of course you have far more insight and experience with these trade-offs—but I feel that wordings could sometimes be altered slightly to have less of a “rebuffing” effect whilst still presenting longtermism as the top cause area. At the same time, I promise to try to actively notice where you’re highlighting other cause areas instead of constantly nitpicking over examples where you don’t.
Like I said, I accept the fact that they are ranking, and the order they’ve come up with according to their values and estimations. I agree with triage—why would I be involved in EA if I didn’t? It’s not the “what” they communicate I have an issue with, it’s the “how”. I’ll admit that pushing x-risk as the number one priority whilst also making other causes sound important is very difficult—of course it is. But that doesn’t mean it’s not worth talking or thinking about (and I’m sure 80k is thinking loads about this).
… I think you misunderstand what we are doing here.
I’ll be honest: I think that’s quite a condescending and rebuffing thing to say (even when phrased as a conditional). You don’t know what I do or don’t understand, and I’m not quite sure who this “we” you’re talking about is supposed to be.
I see what you are saying. However, I’m not disputing 80k’s view that x-risks are the biggest and most important issue (even though I might personally disagree). I’m simply wary about the way they present other cause areas in comparison. Because while you might consider them less important, I think you’d agree they are still pretty important in their own right? Plus, like I’ve mentioned in other posts: the many, many people working in so-called “less important” areas could still be multipliers who make other people aware of EA and 80k, and those people might then start working on x-risks etc.
Oh, I agree, they should be explicit about their opinions. But like you mentioned, they are highly influential and so I’m wary about the effects of their messaging on the many people not focused on longtermism.
What exactly is the change you’d like to see? Stopping saying that working on x-risk/longtermism is important, or still saying that but nevertheless being more encouraging about other areas?
The latter. I very much believe they should do what they think is right. Nevertheless, I feel like some changes to their wording and the (visual) representation of other cause areas might be a good idea, especially given their huge influence overall, not just in longtermism. It would be a tough balance, to be sure, but not impossible imo.
Yes, I think this might be true. For me (at least in the beginning of my time in EA), 80,000 Hours and EA were practically the same thing. So yes, maybe that distinction could be made clearer. But like I’ve said in my reply to Chris Leong: 80k has massive influence and I’m just worried they might be “rebuffing” a large number of engaged EAs. As you mentioned, alternative orgs are much less influential.
I see what you are saying, but given the huge impact 80k has, I feel a bit uneasy about this. What if those many, many “people working on less impactful areas” tell their friends etc. about EA and 80k, and some of them get into x-risk, AI safety and so on? I wouldn’t underestimate the community-building aspect here, and I don’t think potentially “rebuffing” a large number of active EAs would be a net positive.
Neither, although probably closer to B. Of course they are entitled to their opinions and should feel free to express them. I just wish they would do it in a way that didn’t regularly come across as diminishing important efforts in other areas (just my opinion, naturally). Of course that isn’t easy, and others have commented on the difficulty of balancing the two. But I think there are some things that could be done, especially in the wording and visual representation of different cause areas.
Thank you, David, I agree. I think many people are trying to be open-minded, but it’s always difficult to comprehend other people’s values, I suppose...
Good idea, thank you!
Yes, I think this might happen (upvote if you agree)
No, I don’t think that’s a relevant risk (upvote if you agree)
Whoa, okay, that’s a bit of an extreme statement—EA is incredibly broad, and obviously I care about certain cause areas that are deemed valid by the broader EA community—just not by 80k, apparently. But, as other commenters have pointed out, 80k doesn’t equal EA. Sure, longtermism plays a big part in the rest of EA (as I mentioned in my post), but it’s not EA’s top priority, as far as I know. Unlike 80k, I don’t think EA has a “top priority”, because that would imply that the whole movement agrees on one, which I don’t think is very likely to happen. So it’s a little offensive for you to suggest I’m not “suitable” for EA—when in fact I’m doing exactly what the community always encourages you to do when you have an idea or feedback: share it.