I think it’s absurd to say that it’s inappropriate for EAs to comment on their opinions on the relative altruistic impact of different actions one might take. Figuring out the relative altruistic impact of different actions is arguably the whole point of EA; it’s hard to think of something that’s more obviously on topic.
Agreed. My sense is that much of the discomfort comes from people's tendency to want their career paths validated by a central authority. But that isn't the point of 80k. The point of 80k is to direct people towards whatever they think is most impactful. Currently that appears to be mostly x-risk.
If you meet some of the people at places like 80k, I think it’s easier to realize that they are just people with opinions and failings like anyone else. They put a lot of work into making career advising materials, and they might put out materials that say what you are doing is “suboptimal.” If they are right and what you’re doing really is clearly suboptimal, then maybe you should feel bad (or not; it depends on how much you want to feel bad about not maximizing your altruistic impact). But maybe 80k is wrong! If so, you shouldn’t feel bad just because some people who happen to work at 80k made the wrong recommendation.
Like I’ve said in many other comments, I don’t have a problem with their ranking, or with the fact that there is a ranking in the first place. And of course they are explicit about their values. But I still think there are ways to push x-risk as the top priority whilst also conveying other cause areas as more valuable than they currently are. Difficult, of course, but not impossible. The key problem is that I’m not sure many people discouraged from “less important causes” then happily go into longtermism. I think it’s more likely they stop being active altogether (this is my personal impression, of course, from my own experiences and many conversations). You can’t force yourself to care about something when you simply don’t—even if you want to and even if that would be the “best” and most rational thing to do. So people in “less important causes” might be lost altogether, no longer doing their “less important” but still pretty valuable (I think) work. And that is the concern I wanted to voice. Not all that “absurd”, I think.
I dislike this reasoning because it feels deceptive. I don’t think we should push global health and well-being jobs just to make people more aware of EA and 80k. We should communicate accurate information about those jobs and let people choose, while making sure they know the full range of trade-offs.
As above, in response to Chris, you kind of town-and-castle (I’m explicitly trying to move away from “motte and bailey” because I can never remember which is which) between the claim that 80k should be less explicit on cause prioritisation and the claim that doing so means more people working on x-risk causes, etc. I don’t think this is something EA should do on principle.
Yes, I think this might happen (upvote if you agree)
No, I don’t think that’s a relevant risk (upvote if you agree)
Do you think (A) 80k’s opinions are wrong or (B) that they shouldn’t present their existing opinions so explicitly? (or something else?)
Neither, although probably closer to B. Of course they are entitled to their opinions and should feel free to express them. I just wish they would do it in a way that didn’t regularly come across as diminishing important efforts in other areas (just my opinion, naturally). Of course that isn’t easy, and others have commented on the difficulty of balancing the two. But I think there are some things that could be done, especially in the wording and visual representation of different cause areas.