Thoughts on 80,000 Hours’ research that might help with job-search frustrations
I intend to start working at 80,000 Hours this September, and in the meantime they’re contracting me to write some articles about careers and doing good, including this one. Nonetheless, this article represents my personal opinions only, and does not necessarily reflect the views of the 80,000 Hours team.
An EA Forum post from earlier this year demonstrated the difficulty of getting a job in effective altruism organisations, the frustration many people feel as a result, and the sense that other ways of doing good are not as highly valued by the EA community as they should be. People in the community have published a number of thoughtful responses.
I am currently doing some part-time contract work for 80,000 Hours, and I plan to start working there full-time in the fall. As such, I’ve been thinking a lot about 80,000 Hours’ research. And I wanted to add to the discussion a few ways I think that 80,000 Hours content might have inadvertently contributed to these problems, as well as some ideas for how people can get more out of their advice.
The main takeaways are:
Roles outside explicitly EA organizations are most people’s best career options.
Sometimes these roles aren’t as visible to the community, including to 80,000 Hours, but that doesn’t mean they aren’t highly impactful.
Many especially impactful roles require specific skills. If none of these roles are currently a great fit for you, but one could be if you developed the right skills, it can be worth it to take substantial time to do so.
You should use 80,000 Hours to figure out what your best career is and how to get there, not what “the” best careers are.
I haven’t seen people talk that much about the last point, so I spend the most time on it.
Over-representation of EA organizations in 80,000 Hours content
Given unlimited resources, 80,000 Hours could catalog every job opportunity that might be someone’s best option, and then direct that person toward it. But 80,000 Hours is a small team, and has only been around for 7 years. Because of this, their ideas and recommendations should be treated as tentative and growing over time.
Not only that, they are growing outward from the knowledge most central to EA. This means that 80,000 Hours is less likely to know about, and thus less likely to recommend, opportunities that are less familiar within the EA community.
It will be rare for an opportunity at an EA organization to escape their notice. But many great jobs in the wider world never come to 80,000 Hours’ attention, and when they do, there may be no time to look into them.
Thus, opportunities at EA organizations are more likely to be featured—in write-ups, on the job board, in coaching advice—than opportunities at unaffiliated organizations that are less familiar to the EA community. And this will be the case even if the roles at the less familiar organizations have higher potential impact.
The contrast between career paths that 80,000 Hours explicitly recommends and those it doesn’t often reflects differences in those paths’ effectiveness, but sometimes it just reflects differences in how much they’ve been vetted. Just as GiveWell’s recommendations might be missing an effective nonprofit because they haven’t yet looked into it, so might the 80,000 Hours job board be missing many promising roles for high-impact work. And this is more likely when the role or the problem it addresses is less familiar to the EA community, and so less likely to be researched by the 80,000 Hours team.
Talk of talent gaps
One thing the original Forum poster emphasized is that because they had heard there were “talent gaps” in the EA community, they thought getting a job at an EA organization would be relatively easy so long as they were generally capable.
80,000 Hours tried to address this issue in a November article, Think Twice Before Talking About Talent Gaps: Clarifying Nine Misconceptions. Ben Todd says that talk of “talent gaps” at 80,000 Hours and elsewhere might have inadvertently contributed to some misunderstandings. Here are three points from Ben’s post that help explain why the existence of a “talent gap” doesn’t mean that working at an EA organization is the right fit for most smart and capable people who are enthusiastic about effective altruism:
The existence of “talent gaps” doesn’t really mean that there is a lack of talent available. The term refers instead to a lack of specific skills needed in the community. Hence, a better term is “skill bottlenecks.”
Skill bottlenecks in the community don’t mean that it will be easy to get hired at an EA organization. The reason is the point above: one can be hugely talented in various ways, but not have the specific skills needed.
Important skill bottlenecks also exist in policy, academia, and other fields without explicit EA organisations. These are very important and often require specific skilling up outside EA.
These clarifications were made only after the confusion over talent gaps became widespread, and Ben’s post didn’t help as much as 80,000 Hours might have hoped. The full post is worth checking out, both for justification of these positions and other points about how ‘skill bottlenecks’ seem to work in the EA community.
Less focus on earning to give
One reason 80,000 Hours started talking about “talent gaps” in the first place was to combat the misconception that it, or EA in general, is just about earning to give, and to highlight the need for people to do direct work as well. 80,000 Hours has been aggressive in its efforts here because many people outside the core EA community still perceive EA as primarily about, or even synonymous with, earning to give. And 80,000 Hours knows that its past emphasis on earning to give played a substantial role in making this the case.
80,000 Hours’ pushback seems to have failed to convey nuance. Some people—especially those who never had the preconception that EA is primarily about earning to give—have concluded that it thinks earning to give is a bad option for almost everyone, or that more funding is not useful for the EA community. However, that would be taking things too far.
80,000 Hours thinks earning to give is the best option for a substantial number of people—those for whom it’s their comparative advantage. They are keen, however, to make sure that people fully consider direct work options, instead of defaulting to earning to give because they’ve heard it is the best way to do good with one’s career.
This points to a general difficulty: because 80,000 Hours has a large and varied audience, with a wide range of preconceptions about what makes for an effective career, it is hard for it to communicate equally well with every kind of reader. In trying to disabuse one group of people of a tenacious misconception, 80,000 Hours probably inadvertently created another misconception among a different group of people.
Repeating “EA is not all about earning to give” over and over again has a different effect on people who never thought that was the case, or who follow 80,000 Hours closely enough to hear the message many times, than it does on people who did think that was the case and only read something about effective altruism once a year.
The focus on more senior roles
As the EA community has matured, 80,000 Hours has shifted its main focus toward filling as many mid-career (and sometimes even senior) roles as it can. This is because when the best possible candidate fills one of these roles, not only can they have an outsized direct impact, they can help other people have a greater impact too, through mentorship, management, better defining problems that others can then work on, and drawing experienced people into their field.
It used to be much harder to get members of the effective altruism community into more advanced positions, because the community was mostly young and inexperienced. At one point, most of 80,000 Hours’ readers were in college or had recently graduated. But that has changed slowly over time. 80,000 Hours’ audience is now a lot more diverse, including not only young people but also plenty in graduate school or the middle of impressive careers.
80,000 Hours has recently focused more on making material for the latter groups, partly because that content doesn’t exist elsewhere, and partly because filling senior roles is especially impactful. But this may have caused readers earlier in their careers to be discouraged by content that isn’t chiefly aimed at them. Again, the fact that the 80,000 Hours audience is diverse results in the same content being very helpful for some, but unhelpful or even counterproductive for others.
Advice about building career capital by starting at the bottom of an organization should have combated this issue, by showing people who are less advanced in their careers how they should work toward more senior roles. But this advice wasn’t always emphasized enough. And because of the misconceptions about “talent gaps,” especially within EA organizations, some might have gotten the impression that starting in a more entry-level role wasn’t going to be necessary.
Most readers who are still early in their careers must spend considerable time building targeted career capital before they can enter the roles 80,000 Hours promotes most. This might be frustrating, or make people feel like they’re “not doing enough.” But building career capital that’s relevant to where you want to be in 5 or 10 years is often exactly what you should be doing.
The “big list” view of doing the most good with your career (and why it doesn’t make sense)
One way to think about doing good with our careers is to picture a big list of all the career paths in the world anyone could pursue, ordered from most to least impactful. The higher our own career ranks, the more good we’re doing, and the better we should feel about ourselves, the more people should respect us, and so on.
This is a bit of a caricature, but it doesn’t seem that far from the way people sometimes think. I know I often think in this way. And it can sometimes feel like 80,000 Hours’ career advice is articulating a big list like this. Moreover, once we’re thinking in this way, it’s easy to feel like where our path ranks on such a list should have important consequences for our social status or self-esteem.
Still, I think this “big list” picture is wrong, for three reasons:
1. It imagines a single, unified ranking of career paths ordered from most to least impactful.
But the complication of interaction effects between different people’s actions makes me skeptical that there is such a unified ranking, even in theory.
The impacts of different people’s paths and actions often depend on one another, such as in cases of cooperation or trade. Some people’s actions might determine the possibilities for other people’s actions or make them more or less effective. These factors make it very hard to meaningfully rank them.
For example, how would you rank actions by multiple people that are all necessary conditions for any of them to have an impact? Whose actions were more impactful, Norman Borlaug’s or Norman Borlaug’s mother’s?
2. For every reader, such a list would include many paths that they can’t take.
Most of the career paths on a big list of everyone’s possible paths will be irrelevant to you, because obviously your options are limited by your circumstances. More on this below.
3. The “big list” picture makes an unwarranted assumption about how we should feel about ourselves and others.
Many people in the EA community think that the best actions to take in any given situation are those with the most positive impact. But it doesn’t follow that the people we should hold in highest esteem, or who should be most proud, are those whose actions have the most positive impact.
I don’t claim to know exactly how we should think about the criteria for things like esteem and pride. But I don’t think these should be proportional only to how much good a person has done.
Different people have different options available to them. Maybe some career path would be really high-impact, but it’s impossible for someone to take, for whatever reason. From a utilitarian perspective, it clearly doesn’t make sense for them to feel bad about not taking that path, even if it would have done a lot of good had they been able to do so. Feeling bad won’t make them any more able to take the path, nor does it seem to make things overall better in other ways.
From a non-utilitarian perspective, it doesn’t seem like how much good I do fully determines how I should feel about myself either. I hope that in 100 years most people will be doing much more good than I can today because they have greater wisdom, new technology, and better coordination. Should this comparison with the big possible impacts of future generations make us feel worse about ourselves and one another? I don’t think so. How I should feel about myself and how others should feel about me are not just functions of how much good I do.
Despite these flaws, the “big list” picture seems like it might be playing a part in the frustration and disappointment that many people feel in the highly competitive EA job market. It has definitely played a part in my own experience.
Of course, getting rejections is never easy. But if you read 80,000 Hours as offering a “big list” of the best careers, and you feel like your status with yourself or your community is tied up with doing the top things on that list, that makes getting rejected from one of those top things feel even worse.
The “personal list” view
It seems to me that the way to think about doing the most good you can do with your career is to put the emphasis on the you. Each person has their own list of possible careers available to them, ranked from best to worst. What each person’s list looks like depends on:
The options available to them.
How other people will act given what they do.
Our ambition should be to do the best things on our personal lists.
Because each person’s list only includes actions available to them, and because the ranking is determined in part by what other people will do in response to each action, this approach doesn’t have the problems that the “big list” picture does.
But it’s really hard to figure out what is on your personal list, or how the different options compare to each other.
We should read 80,000 Hours as trying to help people figure out what their personal lists look like. Sometimes they do this by giving general advice about how to figure out what options are on your list and how they rank. This often involves tools for assessing your skills and interests, or for understanding how your actions could affect other people.
Sometimes 80,000 Hours tries to help people with their lists by promoting particular career paths that they think do belong high on the lists of some people who haven’t yet realised it. But if, after some investigation, you find some path they talk about isn’t on your personal list, that shouldn’t make you feel bad.
Someone might object: if we don’t feel bad about not doing the most impactful things, how will we be motivated to do the most good?
But as I said above, it is impossible to motivate someone to do something that simply isn’t an option for them. At most, people should feel bad for not doing the top things on their personal lists, or not trying to figure out what those things are.
A different objection is that maybe what my list looks like is itself grounds for pride or esteem.
This has some intuitive appeal, but seems unjustified to me. It can’t be justified in terms of motivation, for the reason just given. And I don’t see any other available argument. Usually we think highly of people for doing the right action among a set of options available to them, or for following the reasons they have to think or act in a certain way. But I don’t know what would justify thinking highly of someone just for their list being what it is.
I actually think the EA social community is generally on board with these points. People don’t go around trying to make others feel bad for not being as impactful as someone else, and we recognize that impactful work is often interdependent in complex ways. But it’s important to be vigilant because it’s pretty easy to fall into the “big list” way of thinking.
In sum: 80,000 Hours’ research does not and cannot yield a “big list” of the best career paths, because no such thing exists. Instead, we should use 80,000 Hours content to map out our own personal lists and figure out how to do the top things on them.
Thanks to Robert Wiblin and Howie Lempel for feedback on a draft of this post.
As someone who a) is an undergraduate, working toward a major that does not open many EA doors (English) and b) is not very familiar with the EA community, I found the 80,000 Hours guide helpful for the following reasons:
It opened me up to the idea that there are probably lots of satisfying careers out there for me. Previously I’d only considered careers in two very narrow fields related to my immediate interests and talents.
It did not make me want to pursue the most impactful job. It made me want to find my most impactful job. Advice like “Don’t do something you won’t enjoy in order to have more impact” was helpful in getting me to think this way.
It specifically brought operations management to my attention as a field where I might excel.
Points of criticism/overall not-great experiences:
Before I saw operations management, I felt lost. Believing that no priority path was open to me, I couldn’t determine what my actual best career path was; I just had examples of some other non-priority paths to consider.
The two(?) places in the guide with a disclaimer of “Actually this is misleading but we haven’t fixed it yet, please see [blog post]” were a little bizarre. Ultimately I felt compelled to read the guide post and then disregard it.
This might change as I dig more into 80k’s resources, but for now, I still don’t have a great sense of:
how to get a job doing operations management for an EA organization or EA cause area
how competitive these positions are
whether ops management will still be a skill bottleneck in 5-10 years
in the event that I build up the skills needed for ops management and then don’t end up in an EA ops job (because it’s not marginally important anymore, or because it’s too competitive), whether there will still be high-impact career paths available to me
Because of these uncertainties—especially the last two bullet points—I’m not sure whether orienting myself toward an EA operations management job is the best use of my current resources.
An interesting data point is that the current Director of Operations at Open Philanthropy, Beth Jones, was previously the Chief Operating Officer of the Hillary Clinton 2016 campaign.
On the other hand, the four operations associates most recently hired by OpenPhil have impressive but not overwhelmingly intimidating backgrounds. I’d like to know how many applied for those four positions.
I think this is super important, and the current guide does a good job of emphasizing it with the multiplicative effect of personal fit.
I think getting 5-10 year forecasts for different careers would be really helpful for many career paths, especially ancillary ones like operations which can be really important if they get bottlenecked.
This may be more difficult to estimate than for a research role: it’s likely that current EA orgs will expand in the near future and need more analysts, but it’s probably harder to say for other roles. I still think it’s super important. Also, since 80K is focusing on mid-level and senior roles and on filling immediate vacancies, they may not prioritize this.
These two sentences seem to be in a lot of tension. If advice about which careers do the most good were entirely personal, it would follow that you could make no general recommendations at all about which careers are better in terms of impact, and therefore 80k should stop what they are doing. However, if you can make general recommendations and thus say which careers have more impact than others, then there is a ‘big list’ after all.
We might disagree about who this is a ‘big list’ for—the average person, an omni-skilled graduate of a top university, the average reader of 80k’s content—but however we fill that out, it’s still possible to see it as a ‘big list’.
I’m entirely with you that it doesn’t make sense to feel bad if someone else can do more good than you. The aim is to do the most good you can do, not the most good someone else who isn’t you can do. Despite recognising this on a conceptual level, I still find it hard to believe and often feel guilty (or shame or sadness) when I think of people whose ‘altruistic successfulness’ surpasses mine.
Hey Michael,
Thanks for commenting. With regard to your first point: I don’t think there is a tension. The idea of a list of the best careers for everyone from top to bottom doesn’t make much conceptual sense. But a list of career paths that it does the most good for a specific set of people to read about and consider does make conceptual sense. I think of 80,000 Hours as more like the latter.
And as I wrote in a reply to your other comment below, a list like that can be really helpful for people in creating their own personal lists of what the best options are for them.
(this is basically to agree with cole_haus’s reply)
It seems quite possible to me to have a “parameterized list”. That is, recommendations can take the shape “If X is true of you, Y and Z are good options.” And in fact 80,000 Hours does do this to some degree (via, for example, their career quiz). While this isn’t entirely personalized (it’s based only on certain attributes that 80,000 Hours highlights), it’s also far from a single, definitive list. So it doesn’t seem that there’s any insoluble tension between taking account of individual differences and communicating the same message to a broad audience—you just have to rely on the audience to do some interpreting.
I don’t think the tension is between those things. The tension is between saying ‘our research is useful: it tells (group X) of people what it is best for them to do’ and ‘our research does not offer a definitive ranking of what it is best for people to do (whether people in group X or otherwise)’. I don’t think you can have it both ways.
Then it seems reasonable to interpret it as (an attempt at) a definitive list if you have those attributes.
I understand why the author is arguing that 80k doesn’t offer a big list, but I think that argument undermines the claim that 80k is useful (“Hey, we’re not telling anyone what to do.” “Really? I thought that was the point.”)
Though I am saying that 80,000 Hours’ research can’t offer a single, definite ranking of what is best for everyone to do, that doesn’t mean that their research isn’t very useful for people figuring out what it is best for them to do.
The way I might put it: 80,000 Hours research helps people put together their own list of what is best for them to do, by (1) offering lots of information people need to combine with their own knowledge about themselves to build their list—e.g., what certain jobs are like, how people typically get into a particular job, and so on, (2) offering tools for people to use to figure out the information about themselves that they need—like for assessing personal fit, etc., and (3) offering guidance on how to prioritize options according to the impact that people in the roles can have under various different circumstances. 80,000 Hours also does things like seek out specific positions and bring them to people’s attention.
All this is really useful, I believe, for helping people do the most good they can with their careers, without any of it amounting to creating a big list of what it’s best for everyone in group x (e.g., the EA community) to do.
Well, they do offer A list of the most urgent global problems. I’ll grant this isn’t a list of what it is best for everyone to do, but it is (plausibly, from their perspective) a list of what it is best for most people to do (or ‘most EAs’ or some nearby specification). Indeed, given 80k has a concept of ‘personal fit’, which is distinct from their rating of the problems, the natural reading of the list is that it provides a general, impersonal ranking of where (average?) individuals can do the most good.
I’m concerned you’re defending a straw man - did anyone ever claim 80k’s list was true for every single possible person? I don’t think so and such a claim would be implausible.
As an anecdote, I’ve always read their list and recommendations as applying to their target audience of talented graduates of elite Western colleges.
Have they ever admitted to specifically targeting graduates of elite colleges rather than ambitious graduates generally?
To be clear, I don’t know whether they specifically target elite college graduates. I was speaking slightly loosely and don’t have any inside information on 80k. It just seems to me that they use elite colleges as a proxy for ambitious graduates.
Yup. In which case, it is a ‘big list’ for such folks.
Last I checked, the career quiz recommends almost everyone (including everyone “early” and “middle” career, no matter their other responses) either “Policy-oriented [UK] government jobs” or “[US] Congressional staffer”, so it hardly seems very reflective of actually believing that the “list” is very different for different people.
fwiw I’ve never gotten those outcomes when I’ve taken the quiz.
I got them on basically every setting that remotely applied to me.
Hi lexande, Habryka, Milan — As you note, the quiz is no longer current content. It has been moved way down in the site structure, and carries a disclaimer to that effect.
Yep, I saw that. I didn’t actually intend to criticize your use of the quiz, sorry if it came across that way. I just gave it a try and figured I would contribute some data.
(This doesn’t mean I agree with how 80k communicates information. I haven’t kept up at all with 80k’s writing, so I don’t have any strong opinions either way here)
Yeah, my response was directed at cole_haus suggesting the quiz as an example of 80k currently providing personalized content, when in fact it’s pretty clearly deprecated, unmaintained, and no longer linked anywhere prominent within the site. (Though I’m not sure what purpose keeping it up at all serves at this point.)
Yeah, I hadn’t realized it was more or less deprecated. (The page itself doesn’t seem to give any indication of that. Edit: Ah, it does. I missed the second paragraph of the sidenote when I quickly scanned for some disclaimer.)
Also, apparently unfortunately, it’s the first sublink under the 80,000 Hours site on Google if you search for 80,000 Hours.
“The page itself doesn’t seem to give any indication of that.”
As I pointed out, it does say so at the top.
We’ve been working to get it downgraded from the Google search results, but unfortunately we don’t have full control over that.
Why don’t you just take it down entirely? It’s already basically non-functional.
I like there being a record of out-of-date recommendations and tools on the 80K site [edit: so I know how they’ve updated, and so I can access the parts of old resources that aren’t out of date].
A curated list of Archive links might work OK as a replacement, I suppose. But in general, given that various pages have accumulated offsite hyperlinks over the years, I think it’s more informative to plaster giant “this content is out-of-date because X” disclaimers on the relevant pages, rather than just taking the page down.
That applies to most of the deprecated pages, but doesn’t apply to the quiz, because its results are based on the database of existing career reviews. The fact that it gives the same results for nearly everybody is the result of new reviews being added to that database since it was written/calibrated. It’s not actually possible to get it to show you the results it would have showed you back in 2016 the last time it was at all endorsed.
Yep, the quiz may be an exception! I was commenting on the general thread of discussion on this page “just take everything down that’s out of date,” and the quiz subthread was just the one that caught my eye. My apologies for making it sound like the quiz in particular is the thing I want preserved; I don’t have a strong view on that.
Perhaps the case for keeping the page up has something to do with the page being highly ranked on Google search...
Ah, I see that now. Thanks.
FWIW, I was specifically looking for a disclaimer and it didn’t quickly come to my attention. It looks like a few other people in these subthreads may have also missed the disclaimer.
It’s not as prominent as it should be. We’re going to fix that.
Ah, I hadn’t taken the quiz in a couple years. Looks like they’ve changed it since then.
I just tried 5 different answer-configurations of the quiz: https://80000hours.org/career-quiz/
And got “congressional staffer” or “policy-oriented government job” for all configurations. Guess I should move to DC.
Congressional staffer and policy-oriented jobs are the two highest-weighted profiles of about 20, so everyone will automatically get these (followed by a ranked list of the remaining 15 options, but only the first profile is displayed, possibly a glitch). The biggest filter is quantitative: if you select “no” it cuts 15 profiles, and then any profiles with weights of 0 are cut.
The weights are biased: it’s impossible to get arts or marketing as a final result (because they aren’t recommended job pathways within EA). The basic premise is: if you are good at math and science, do these high-impact math and science jobs; if not, do any other non-quantitative high-impact job we recommend.
The results aren’t personal enough to justify designing it this way: a simple list of all 35 profiles with a quantitative-skills filter, ranked by 80k’s weights, would be sufficient. It’s really not well-suited to a quiz; a filtered list like the job board might be more effective.
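To illustrate the filter-then-rank behaviour described above, here is a minimal sketch in Python. The profile names and weights are invented for illustration only; this is not 80K’s actual quiz code or data.

```python
# Hypothetical sketch of the filter-then-rank logic described above.
# Profile names and weights are made up for illustration; they are not 80K's data.
PROFILES = {
    "Congressional staffer": {"weight": 10, "quantitative": False},
    "Policy-oriented government job": {"weight": 9, "quantitative": False},
    "ML safety research": {"weight": 8, "quantitative": True},
    "Economics PhD": {"weight": 7, "quantitative": True},
    "Operations management": {"weight": 5, "quantitative": False},
    "Marketing": {"weight": 0, "quantitative": False},  # weight 0: never shown
}

def rank_profiles(is_quantitative: bool) -> list[str]:
    """Drop quantitative profiles if the user answers 'no', drop zero-weight
    profiles, then rank whatever is left by weight (highest first)."""
    candidates = {
        name: info
        for name, info in PROFILES.items()
        if (is_quantitative or not info["quantitative"]) and info["weight"] > 0
    }
    return sorted(candidates, key=lambda name: candidates[name]["weight"], reverse=True)

# Whatever the user answers, the two highest-weighted unfiltered profiles
# (here the two policy ones) always come out on top.
print(rank_profiles(is_quantitative=False))
print(rank_profiles(is_quantitative=True))
```

Under these assumptions, the answers only prune the tail of the ranking; the top of the list barely changes, which matches the behaviour reported in this thread.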
Note that the “policy-oriented government job” article is specific to the UK. Some of the arguments about impact may generalize, but the UK civil service generally has more influence on policy than its counterparts in the US or some other countries, and the more specific information about paths in, etc., doesn’t really generalize at all.
Great comment.
In my experience, feelings like this have flowed from not being clear on the motivations driving my actions. More on this here: Altruistic action is dispassionate
Getting clearer on the “what motivations are driving this?” thing has been really helpful (both for improving my subjective experience, and for boosting my efficacy).
1) If the way you talk about career capital here is indicative of 80k’s current thinking then it sounds like they’ve changed their position AGAIN, mostly reversing their position from 2018 (that you should focus on roles with high immediate impact or on acquiring very specific narrow career capital as quickly as possible) and returning to something more like their position from 2016 (that your impact will mostly come many years into your career so you should focus on building career capital to be in the best position then). It’s possible the new position is a synthesis somewhere between these two, since you do include the word “targeted”, but how well can people feasibly target narrow career capital 5-10 years out when the skill bottlenecks of the future will surely be different?
2) In general I’ve noticed a pattern (of which the above two linked posts are an example) where 80k posts something like “our posts stating that ‘A is true’ have inadvertently caused many people to believe that A is true, here’s why A is actually false” while leaving up the old posts that say ‘A is true’ (sometimes without even a note that they might be outdated). This is especially bad when the older ‘A is true’ content is linked conveniently from the front page while the more recent updates are buried in blog history. Is it feasible for 80k to be more aggressive in taking down pages they no longer endorse so they at least don’t continue to do damage, and so the rest of us can more easily keep track of what 80k actually currently believes?
3) Regarding the problem of a diverse audience with differing needs, an obvious strategy for dealing with this is to explicitly state (for each page or section of the site) who the intended audience is. I’ve found that 80k seems strangely reluctant to answer what level of human/social/career/financial capital they assume their audience has, even when asked directly.
I didn’t mean for what I said to suggest a departure from 80,000 Hours’ current position on career capital. They still think (and I agree) that it’s better for most people to have a specific plan for impact in mind when they’re building career capital, instead of just building ‘generic’ career capital (generally transferable skills), and that in the best case that means going straight into a career path. But of course sometimes that won’t be possible and people will have to skill up somewhere else.
This is a good question, and it’s of course not easy to predict what the most impactful things to do in 5-10 years will be—it seems unlikely, though, that working toward one of 80,000 Hours’ “priority paths” will become not very useful down the line. And in general being sensitive to how skill bottlenecks might change in the future is definitely something that 80,000 Hours is keen on.
To your second point: I mean, yeah—it’s hard to keep everything up to date, especially as the body of research grows, but it’s obviously bad to have old content up there that is misleading or confusing. Updating and flagging (and maybe removing—I’m not sure) old content is something 80,000 Hours is working on.
I’m not sure what exactly the 80,000 Hours team would say about explicitly labeling different pages with notes about the intended audience, but my guess is that they wouldn’t want to do that for a lot of their content because it’s very hard to say exactly who it will be useful for. They do have something about intended audience on their homepage: “Ultimately, we want to help everyone in the world have a big social impact in their career. Right now, we’re focusing on giving advice to talented and ambitious graduates in their twenties and thirties.” I know that’s vague, but it seems like it has to be vague to keep from screening off people who could benefit from the research.
Maybe they could do a better job of helping people figure out what content is for them and what content isn’t, but it doesn’t seem to me at least like explicit labels at the top of pages would be the right way to go about it.
On career capital: I find it quite hard to square your comments that “Most readers who are still early in their careers must spend considerable time building targeted career capital before they can enter the roles 80,000 Hours promotes most” and “building career capital that’s relevant to where you want to be in 5 or 10 years is often exactly what you should be doing” with the comments from the annual report that “You can get good career capital in positions with high immediate impact (especially problem-area specific career capital), including most of those we recommend” and “Discount rates on aligned-talent are quite high in some of the priority paths, and seem to have increased, making career capital less valuable.” Reading the annual report definitely gives me the impression that 80k absolutely does not endorse spending 5-10 years in low-impact roles to try to build career capital for most people, and so if this is incorrect then it seems like further clarification of 80k’s views on career capital on the website should be a high priority.
On planning: While I expect 80k’s current priority paths will probably all still be considered important in 5-10 years time, it’s harder to predict whether they will still be considered neglected. It’s easy to imagine some of the fields in question becoming very crowded with qualified candidates, such that people who start working towards them now will have extreme difficulty getting hired in a target role 5-10 years from now, and will have low counterfactual impact if they do get hired. (It’s also possible, though less likely, that estimates of the tractability of some of the priorities will decline.)
On outdated content: I appreciate 80k’s efforts to tag content that is no longer endorsed, but there have often been long delays between new contradictory content being posted and old posts being tagged (even when the new post links to the old post, so it’s not like it would have required extra effort to find posts needing tagging). Further, posts about the new position sometimes fail to engage with the arguments for the old position. And in many cases I’m not sure what purpose is served by leaving the old posts up at all. (It’s not like taking them down would be hiding anything, they’d still be on archive.org.)
On article targeting: In your original post you gave the example of 80k deliberately working to create more content targeted at people later in their careers, and this winding up discouraging some readers who are still early in their careers. Surely at least in that case you could have been explicit about the different audience you were deliberately targeting? More generally, you express concern about “screening off people who could benefit from the research”, but while such false negatives are bad, failing to screen off people for whom your advice would be useless or harmful is also bad, and I think 80k currently errs significantly in the latter direction. I also find it worrying if not even 80k’s authors know who their advice is written for, since knowing your target audience is a foundational requirement for communication of any kind and especially important when communicating advice if you want it to be useful and not counterproductive.
Do you have examples of this?
The posts I linked on whether it’s worth pursuing flexible long-term career capital (yes says the Career Guide page, no says a section buried in an Annual Report, though they finally added a note/link from the yes page to the no page a year later when I pointed it out to them) are one example.
The “clarifying talent gaps” blog post largely contradicts an earlier post still linked from the “Research > Overview (start here)” page expressing concerns about an impending shortage of direct workers in general, as well as Key Articles suggesting that “almost all graduates” should seek jobs in research, policy or EA orgs (with earning-to-give only as a last resort) regardless of their specific skills. The latter in turn contradict pages still in the Career Guide and other posts emphasizing earning-to-give as potentially superior to even such high-impact careers as nonprofit CEO or vaccine research.
Earlier they changed their minds on replaceability (before, after); the deprecated view there is no longer prominently linked anywhere but I’m unsure of the wisdom of leaving it up at all.
Given how much 80k’s views have changed over the past 5-10 years, it’s hard to be optimistic about the prospects for successfully building narrow career capital targeted to the skill bottlenecks of 5-10 years from now!
Hi lexande —
Re point 1, as you say the career capital career guide article now has the disclaimer about how our views have changed at the top. We’re working on a site redesign that will make the career guide significantly less prominent, which will help address the fact that it was written in 2016 and is showing its age. We also have an entirely new summary article on career capital in the works—unfortunately this has taken a lot longer to complete than we would like, contributing to the current unfortunate situation.
Re point 2, the “clarifying talent gaps” post and “why focus on talent gaps” article do offer different views as they were published three years apart. We’ve now added a disclaimer linking to the new one.
The “Which jobs help people the most?” career guide piece, taken as a whole, isn’t more positive about earning to give than the other three options it highlights (research, policy and direct work).
I think your characterisation of the process we suggest in the ‘highest impact careers’ article could give readers the wrong impression. Here’s a broader quote:
You say that that article ‘largely contradicts’ the ‘clarifying talent gaps’ post. I agree there’s a shift in emphasis, as the purpose of the second is to make it clearer, among other things, how many people will find it hard to get into a priority path quickly. But ‘largely contradicts’ is an exaggeration in my opinion.
Re point 3, the replaceability blog post from 2012 you link to as contradicting our current position opens with “This post is out-of-date and no longer reflects our views. Read more.”
Our views will continue to evolve as we learn more, just as they have over the last seven years, though more gradually over time. People should take this into account when following our advice and make shifts more gradually and cautiously than if our recommendations were already perfect and fixed forever.
Updating the site is something we’ve been working on, but going back to review old pages trades off directly with writing up our current views and producing content about our priority paths, something that readers also want us to do.
One can make a case for entirely taking down old posts that no longer reflect our views, but for now I’d prefer to continue adding disclaimers at the top linking to our updated views on a question.
If you find other old pages that no longer reflect our views and lack such disclaimers, it would be great if you could email those pages to me directly so that I can add them.
Just to be clear, I added the disclaimer Rob mentions to that page today, after lexande wrote their initial comment. I don’t think Rob realised that the disclaimer was new.
[Rob’s now edited his post to make that clear.]
Hey Lexande-
Just to address your last point/question: I don’t think that the right thing to take away from 80,000 Hours changing its mind over the years on some of these points is pessimism about the targeted career capital one builds now being useful in 5-10 years—there are a lot of ways to do good, and these changes reflect 80,000 Hours’ changing views on what the absolute optimal way of doing good is. That’s obviously a hard thing to figure out, and it obviously changes over time. But most of their advice seems pretty robust to me. Even if it looks like in 5-10 years it would have been absolutely optimal to be doing something somewhat different, having followed 80,000 Hours advice would still likely put you in a position that is pretty close to the best place to be.
For example, if you are working toward doing governmental AI policy, and in 5-10 years that area is more saturated, and so slightly less optimal, than they currently expect, and it’s now better to be working in an independent think tank, or on other technology policy, etc., then (1) what you’re doing is probably still pretty close to optimal, and (2) you might be able to switch over because the direct work you’ve been engaging in has also resulted in useful career capital.
It’s also important to remember that if in 10 years some 80,000 Hours-recommended career path, such as AI policy, is less neglected than it used to be, that is a good thing, and doesn’t undermine people having worked toward it—it’s less neglected in this case because more people worked toward it.
The specific alternatives will vary depending on the path in question and on hard-to-predict things about the future. But if someone spends 5-10 years building career capital to get an operations job at an EA org, and then it turns out that field is extremely crowded with the vast majority of applicants unable to get such jobs, their alternatives may be limited to operations jobs at ineffective charities or random businesses, which may leave them much worse off (both personally and in terms of impact) than if they’d never encountered advice to go into operations (and had instead followed one of the more common career paths for ambitious graduates, and been able to donate more as a result).
I’m also concerned about broader changes in how we think about priority paths over the coming 5-10 years. A few years ago, 80k strongly recommended going into management consulting, or trying to found a tech startup. Somebody who made multi-year plans and sacrifices based on that advice would find today that 80k now considers what they did to have been of little value.
80,000 Hours has a responsibility to the people who put their trust in it when making their most important life decisions, to do everything it reasonably can to ensure that its advice does not make them worse off, even if betraying their trust would (considered narrowly/naively) lead to an increase in global utility. Comments like the above, as well as the negligence in posting warnings on outdated/unendorsed pages until months or years later, comments elsewhere in the thread worrying about screening off people who 80k’s advice could help while ignoring the importance of screening off those who it would hurt, and the lack of attention to backup plans, all give me the impression that 80k doesn’t really care about the outcomes of the individual people who trust it, and certainly doesn’t take its responsibility towards them as seriously as it should. Is this true? Do I need to warn people I care about to avoid relying on 80k for advice and read its pages only with caution and suspicion?
Hi lexande—thanks for taking the time to share your worries with us. We take our responsibility towards our users seriously.
I don’t think we’re likely to come to agreement right now on a lot of the other specific issues that have been raised.
That said, it’s helpful to know when our users strongly disagree with our priorities and we take that into account when we form our plans.
Speaking from experience running workshops based on 80K for college students (1st and 2nd years) we always emphasise the personal list view and balance it with more general 80K career/cause profiles in the following ways:
1) Exploring unfamiliar careers and career paths. 80K covers some unusual careers that are worth spending time on, both in cause areas and job types (like the High Impact Management profile). This helps expand people’s options, especially early in their college career.
2) Build flexible career capital. Unless someone is sure of their interests, we suggest building flexible career capital and experimenting with different experiences/internships before committing to one career path. Edit: We also try to emphasize the flexibility of different paths, and how each choice closes off or opens different paths, so people are making the best choices based on their interests.
3) Personal fit and comparative advantage. I’ve been surprised at how quickly people evaluate their own personal fit, and encourage them to use 1) to consider jobs that they might be good at but unfamiliar with. We also emphasize comparative advantage to further personalize this.
4) Critiquing 80K’s claims. We try to engage students to critically think about the claims we (and 80K) make and push back against them. We have had some really good discussions about moral frameworks and the article What skills make you most employable?
In general, I think the first step to 80K is really to develop this robust framework first with the advice that is generally true for all careers, get exposure to different career paths and come to your own list before looking at the annual reports and more recent recommendations.
I also think that 80K should create resources for local group leaders, many of whom engage in one-on-one career advice, and that more effort should be made to track EAs pursuing different careers.
These days 80k explicitly advises against trying to build flexible career capital (though I think they’re probably wrong about this).
I would disagree with this too, especially for advice given to college students (the students in our course are mostly first- and second-years; added to my original comment for clarity). We recommend they try out different paths because often their go-to career options may not be particularly high-impact. Our advice is targeted mostly at the four years of college. But I think it’s generally good advice for most people to know, because people’s personal interests and inclinations differ.
I also noticed in the link you provide that 80K suggests going straight to graduate school, which I would also not recommend unless you are certain of what you want to do (and because it can be a substantial financial burden, especially in the US).
One of 80K’s strongest features was (since they seem to be moving in a different direction) giving good generic career advice, especially for undergraduates. It would be a shame to lose this because I think it makes a great initial impression to newcomers and convinces them straight off the bat of how useful EA can be in helping them make meaningful impact, even if they aren’t convinced by all of the ideas behind EA immediately.
That sounds plausible to me if the same recommendations apply to newcomers and to die-hard EAs, such that “do we give advice that’s useful for general audiences?” is just a question of “which good-to-follow advice do we emphasize?” and not “which advice is good for a given demographic to follow?”.
On the other hand, I don’t want 80K to give advice that’s actively bad for die-hard EAs to follow, no matter how useful that advice is to students in general. From my perspective (which might reflect a different set of goals than yours, since I’m not coming at this question from your position), that would make it too hard to just zip over to 80K’s website for advice and trust that I’m getting relevant information.
I don’t think we should underestimate the value of being able to trust that an information source is giving us exactly what it thinks the very best advice is, without having to worry about how much the source might be diluting that advice to make it more memetic or easy-to-sell. Being able to just take statements on the 80K website at face value is a big deal.
If a certain piece of advice turns out to be good for most students but bad for most EA students, then I could see it being possibly interesting and useful for 80K to make a page like “Here’s how our advice to most students would differ from our advice to EA students.” That could then serve a dual purpose by clarifying what sensible “baseline” advice looks like. I think it would also be fine for 80K to link to some offsite, non-80K-branded career advice that they especially endorse for other students, even though they specifically don’t endorse it for maximizing your career’s altruistic impact.
I think this is a good idea. I personally don’t think the general advice I’ve been referring to (about personal fit and flexible career capital) would actively harm individual EAs (it might, but I doubt it) as a general framework. I also don’t think it would harm the community in the long term, because we don’t want people to be demoralized or burn out. But what you suggest might alleviate some of these concerns.
An alternative is to have clear parameterized “if X then Y” lists, like cole_haus suggests above, which would solve this issue of not getting the best advice. That way, there is no dilution, simply targeting of different audiences. Any kind of mass outreach has the problem that not everything will apply to everyone.
My biggest concern with what you suggest is that 80K is a major first point of contact for new EAs. According to the most recent EA Survey, 25% of new EAs in 2018 first heard of EA through 80K, way up from previous years of 5%. For the reasons I gave above, I think giving general (but still impact-related) advice is going to be really important for people to continue engaging with the community. It also probably won’t help the diversity issue (in professional expertise) within EA (although it seems like that’s fairly low-priority across the board). So, hardcore EA advice might be too much for newcomers versus the more general “ease into the EA mindset” approach of the original 80K guide, which is still EA-branded in some way and so maintains engagement with the community.
Yeah, I don’t have a strong object-level view about exactly which advice is best for most EAs; I just wanted to voice some support for letting those recommendations drift apart if it does end up looking like EAs and non-EAs benefit from different things. I think “if X then Y” can definitely be a good solution.
This seems great to me! Thanks for writing this out. As for building flexible career capital (re: the comment below): flexibility is of course good all else equal, and more important the earlier people are in their careers. It’s just that people can face a trade-off at some point between flexibility and usefulness to something specific. I think 80,000 Hours has changed its views on how to weight the considerations in that trade-off, favoring usefulness to something specific more than they used to. But if someone can both work toward something that they think will be really valuable and build flexible career capital at the same time, that seems all the better.
Ideally doing both is nice, and I think the trade-off is definitely important. As I mentioned above in my comments to Rob, targeted advice may solve the trade-off question.
If I remember correctly, 80,000 Hours has stated that they think 15% of people in the EA Community should be pursuing earning to give. Have they revised this opinion or am I remembering it incorrectly?
If not, your description seems a bit misleading to me. Substantial number sounds like a significantly higher fraction of people to me, perhaps something like 40% instead of 15%.
Yes, thanks ardenlk for your article, but I personally would find it helpful to see more quantification in 80,000 Hours’ research. As Denise says, it would be good to know what fraction is your current view and the rationale/figures behind it, perhaps including your estimates for some of the following:
The size of the EA population pursuing roles you recommend.
The number of such roles available over a year.
The probability of an average EA landing one of these roles over a year (perhaps lower than the ratio of roles to applicants, because some applicants are more qualified/experienced than the average EA).
The monetary impact of each of the roles you recommend for EAs, to compare to earning to give.
I think this would help EAs to make a more informed decision about their career choice.
Apologies if you’ve already published these or similar figures and I haven’t seen them. Perhaps there is a fear of not wanting to be held hostage to these numbers, but I think it’s fine to change your mind/estimates as the facts change (as per John Maynard Keynes), and understandable given that research into effective careers is at an early stage.
I think it would be misleading if OP had said ‘substantial proportion’. I read ‘substantial number’ as a comment on the absolute numbers, which is vague (how many is ‘substantial’) but not misleading.
I think this is the article you’re thinking about, where they’re talking about the paths of marginal graduates. Note that it’s from 2015 (though at least Will said he still thought it seemed right in 2016) and explicitly labeled with “Please note that this is just a straw poll used as a way of addressing the misconception stated; it doesn’t represent a definitive answer to this question”.
As a problem with the ‘big list’, you mention
But it seems like there’s another problem, closely related to this one: for every reader, the paths on such a list could have different orderings. If someone has a comparative advantage for a role, it doesn’t necessarily mean that they can’t aim for other roles, but it might mean that they should prefer the role that they have a comparative advantage for. This is especially true once we consider that most people don’t know exactly what they could do and what they’d be good at—instead, their personal lists contain a bunch of things they could aim for, ordered according to different probabilities of having different amounts of impact.
In particular, I think it’s a bad idea to take a ‘big list’, winnow away all the jobs that look impossible, and then aim for whatever is on top of the list. Instead, your personal list might overlap with others’, but have a completely different ordering (yet hopefully contain a few items that other people haven’t even considered, given that 80k can’t evaluate all opportunities, like you say).
Really good article. I have been critical of 80,000 Hours in the past, but this article caused me to substantially update my views. I am happy to hear you will be at 80,000 Hours.
This is great, thanks.