I guess in terms of this metaphor, part of what I'm saying is that there are also some people who aren't "in the race" but really would do great if they joined it (maybe after a few false starts), and other people who are in the race but would be better off switching to tennis instead (and that's great too!).
And I'm a little worried about saying something like "Hey, the race is super hard! But don't feel bad, you can go train up somewhere else for a while, and then come back!"
Because some people could do great in the race already! (Even if several applications don't work out or whatever; there's a lot of variation between what different roles need, and a lot of random chance, etc.) And some of these people are erring in the direction of self-selecting out, feeling imposter syndrome, getting 4 job rejections and then thinking "well, that proves it, I'm not right for these roles", etc.
Meanwhile, other people shouldn't "train up and come back", but rather go do great things elsewhere long-term! (Not necessarily leaving the community, but just not working at an explicitly EA org or with funding from EA funders.) And some of these people are erring in the direction of having their heart set on getting into an "EA job" eventually, even if they have to train up elsewhere first.
---
I'd also be worried about messaging like "Everyone needs to get in this particular race right now! We have lots of money and lots to do and y'all need to come over here!" And it definitely seems good to push against that. But I think we can try to find a middle ground that acknowledges there are many different paths, and different ones will be better for different people at different times, and that's ok. (E.g., I think this post by Rob Wiblin does that nicely.)
Figuring out how to give the right advice to the right person is a hard challenge. That's why I framed skilling up outside EA as being a good alternative to "banging your head against the wall indefinitely." I think the link I added to the bottom of this post addresses the "many paths" component.
The main goal of my post, though, is to talk about why there's a bar (hurdle rate) in the first place. And, if readers are persuaded of its necessity, to suggest what to do if you've become convinced that you can't surpass it at this stage in your journey.
It would be helpful to find a test to distinguish EAs who should keep trying from those who should exit, skill up, and return later. Probably one-on-one mentorship, coupled with data on what sorts of things EA orgs look for in an applicant, and the distribution of applicant quality, would be the way to devise such a test.
A team capable of executing a high-quality project to create such a test would (if I were an EA fund) definitely be worthy of a grant!
Also, here's a somewhat relevant intervention idea that seems interesting to me, copied from an upcoming post in my sequence on improving the EA-aligned research pipeline (so this passage focuses on research roles, but you can easily extrapolate the ideas):
Improving the vetting of (potential) researchers, and/or better "sharing" that vetting
For example:
Improving selection processes at EA-aligned research organisations
Increasing the number and usefulness of referrals of candidates from one selection process (e.g., for a job or a grant) to another selection process.
This already happens, but could perhaps be improved by:
Increasing how often it happens
Increasing how well-targeted the referrals are
Increasing the amount of information provided to the second selection process?
Increasing how much of the second selection process the candidate can "skip"?
Creating something like a "Triplebyte for EA researchers", which could scalably evaluate aspiring/junior researchers, identify talented/promising ones, and then recommend them to hirers/grantmakers^[This idea was suggested as a possibility by a commenter on a draft of this post.]
This could resolve most of the vetting constraints if it could operate efficiently and was trusted by the relevant hirers/grantmakers
Triplebyte's value proposition to its clients (the companies who pay for its services) is an improved technical interview process. They claim to offer tests that achieve three forms of value:
Less biased
More predictive of success-linked technical prowess
Convenient (since companies don't have to run the technical interviews themselves)
If there's room for an "EA Triplebyte," that would suggest that EA orgs have at least one of those three problems.
So it seems like your first step would be to look in depth at the ways EA orgs assess technical research skills.
Are they looking at the same sorts of skills? Are their tests any good? Are the tests time-consuming and burdensome for EA orgs? Alternatively, do many EA orgs pass up on needed hires because they don't have the short-term capacity to evaluate them?
Then you'd need to consider what alternative tests would be a better measurement of technical research prowess, and how to show that they are more predictive of success than present technical interviews.
It would also be important to determine the scale of the problem. Eyeballing this list, there's maybe 75 EA-related organizations. How many hires do they make per month? How often does their search fail for lack of qualified candidates? How many hours do they spend on technical interviews each time? Will you be testing not for EA-specific but for general research capacity (massively broadening your market, but also increasing the challenge of addressing all their needs)?
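To make the scale question a bit more concrete, here is a purely illustrative back-of-the-envelope sketch. Every number in it is an assumption invented for the example (not data about actual EA orgs); it only shows the shape of the calculation the scoping work would need to redo with real figures.

```python
# Illustrative back-of-the-envelope estimate of the screening time an
# "EA Triplebyte" might touch. Every input below is an invented assumption,
# not a real figure about EA organizations.
num_orgs = 75                  # rough count of EA-related orgs (from the list above)
hires_per_org_per_year = 2     # assumed research hires per org per year
applicants_per_hire = 40       # assumed applicants screened per successful hire
hours_per_screen = 1.5         # assumed staff hours spent evaluating each applicant
fraction_outsourceable = 0.5   # assumed share of screening a shared service could absorb

total_screen_hours = (num_orgs * hires_per_org_per_year
                      * applicants_per_hire * hours_per_screen)
saved_hours = total_screen_hours * fraction_outsourceable

print(f"Total screening hours per year: {total_screen_hours:,.0f}")
print(f"Hours a shared vetting service might absorb: {saved_hours:,.0f}")
```

Swapping in real answers to the questions above would quickly show whether the labor saved is large enough to justify a dedicated service.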
Finally, you'd need to roll that up into a convenient, trustworthy, and reliable package that clients are excited to use instead of their current approach.
This seems like a massive amount of work, demanding a strong team, adequate funding and prior interest by EA orgs, and long-term commitment. It also sounds like it might be really valuable if done well.
Thanks for these thoughts - I think I'll add a link to your comment from that section of my post.
I think your analysis basically sounds correct to me. I would also be quite surprised if this came into existence (and was actually used by multiple orgs) in the next 10 years, and I don't think it's likely to be the highest priority intervention for improving the EA-aligned research pipeline, though I'd be keen to at least see people flesh out and explore the idea a bit more.
FWIW, I'm guessing that this commenter on my doc meant something a little bit more distant from Triplebyte specifically than what your comment suggests - in particular, I don't think the idea would be just to conduct technical interviews, but also other parts of the selection process. At least, that's how I interpreted the comment, and that interpretation seems better to me, given that I think it's relatively rare for EA orgs to actually have technical interviews in their selection processes. (There may often be a few questions like that, but without it being the main focus for the interview. Though I also might be misinterpreting what a technical interview is - I haven't worked in areas like engineering or IT.)
My sense is that Triplebyte focuses on "can this person think like an engineer" and "which specific math/programming skills do they have, and how strong are they?" Then companies do a second round of interviews where they evaluate Triplebyte candidates for company culture. Triplebyte handles the general, companies handle the idiosyncratic.
It just seems to me that Triplebyte is powered by a mature industry that's had decades of time and massive amounts of money invested into articulating its own needs and interests. Whereas I don't think EA is old or big or wealthy enough to have a sharp sense of exactly what the stable needs are.
For a sense of scale, there are almost 4 million programmers in the USA. Triplebyte launched just 5 years ago. It took millions of people working as programmers to generate adequate demand and capacity for that service to be successful.
All in all, my guess is that what we're missing is charismatic founder-types. The kind of people who can take one of the problems on our long lists of cause areas, turn it into a real plan, pull together funding and a team (of underutilized people), and make it go.
Figuring out how to teach that skill, or replace it with some other founding mechanism, would of course be great. It's necessary. Otherwise, we're kind of just cannibalizing one highly-capable project to create another. Which is pretty much what we do when we try to attract strong outside talent and "convert" them to EA.
Part of the reason I haven't spent more time trying to found something right off the bat is that I thought EA could benefit more if I developed a skillset in technology. But another reason is that I just don't have the slack. I think to found something, you need significant savings and a clear sense of what to do if it fails, such that you can afford to take years of your life, potentially, without a real income.
Most neophytes don't have that kind of slack. That's why I especially lean on the side of "if it hurts, don't do it."
I don't have any negativity toward the encouragement to try things and be audacious. At the same time, there's a massive amount of hype and exploitative stuff in the entrepreneurship world. This "Think of the guy who wrote Winzip! He made millions of dollars, and you can do it too!" line that business gurus use to suck people into their self-help sites and YouTube channels and so on.
The EA movement had some low-hanging fruit to pick early on. It's obviously a huge win for us to have great resources like 80k, or significant organizations like OpenPhil. Some of these were founded by world-class experts (Peter Singer) and billionaires, but some (80k) were founded by some young audacious people not too far out of grad school. But those needs, it seems to me, are filled. The world's pretty rich. It's easier to address a funding shortfall or an information shortfall than to get concrete useful direct work done.
Likewise in the business world, it's easier to find money for a project and outline the general principles of how to run a good business than to actually develop and successfully market a valuable new product. There's plenty of money out there, and not a ton of obvious choices to spend it on. Silicon Valley's looking for unicorns. We're looking for unicorns too. There aren't many unicorns.
I think that the "EA establishment's" responsibility to neophytes is to tell them frankly that there's a very high bar, it's there for a reason, and for your own sake, don't hurt yourself over and over by failing to clear it. Go make yourself big and strong somewhere else, then come back here and show us what you can do. Tell people it's hard, and invite them back when they're ready for that kind of challenge.
> I think to found something, you need significant savings and a clear sense of what to do if it fails, such that you can afford to take years of your life, potentially, without a real income.
I don't think this is true, at least not as a general rule. I think you can do both (have a safe career and pursue something entrepreneurial) if you make small, focused bets to begin with and build from there. Related discussion here.
I agree, I should have added "or a safe career/fallback option" to that.
> My sense is that Triplebyte focuses on "can this person think like an engineer" and "which specific math/programming skills do they have, and how strong are they?" Then companies do a second round of interviews where they evaluate Triplebyte candidates for company culture. Triplebyte handles the general, companies handle the idiosyncratic.
I used to work as an interviewer for TripleByte. Most companies using TripleByte put TripleByte-certified candidates through their standard technical onsite. From what I was able to gather, the value prop for companies working with TripleByte is mostly about 1. expanding their sourcing pipeline to include more quality candidates and 2. cutting down on the amount of time their engineers spend administering screens to candidates who aren't very good.
Some of your comments make it sound like a TB-like service for EA has to be a lot better than what EA orgs are currently doing to screen candidates. Personally, I suspect there's a lot of labor-saving value to capture if it is merely as good as (or even a bit worse than) current screens. It might also help organizations consider a broader range of people.
Thanks for that context, John. Given that value prop, companies might use a TB-like service under either of two conditions:
They are bottlenecked by having too few applicants. In this case, they have excess interviewing capacity, or more jobs than applicants. They hope that by investigating more applicants through TB, they can find someone outstanding.
Their internal headhunting process has an inferior quality distribution relative to the candidates they get through TB. In this case, they believe that TB can provide them with a better class of applicants than their own job search mechanisms can identify. In effect, they are outsourcing their headhunting for a particular job category.
Given that EA orgs seem primarily to lack specific forms of domain expertise, as well as well-defined project ideas/teams, what would an EA Triplebyte have to achieve?
They'd need to be able to interface with EA orgs and identify the specific forms of domain expertise that are required. Then they'd need to be able to go out and recruit those experts, who might never have heard of EA, and get them interested in the job. They'd be an interface to the expertise these orgs require. Push a button, get an expert.
That seems plausible. Triplebyte evokes the image of a huge recruiting service meant to fill cubicles with basically-competent programmers who are pre-screened for the in-house technical interview. Not to find unusually specific skills for particular kinds of specialist jobs, which it seems is what EA requires at this time.
That sort of headhunting job could be done by just one person. Their job would be to do a whole lot of cold-calling, getting meetings with important people, doing the legwork that EA orgs don't have time for. Need five minutes of a Senator's time? Looking to pull together a conference of immunologists to discuss biosafety issues from an EA perspective? That's the sort of thing such an org would strive to make more convenient for EA orgs.
As they gained experience, they would also be able to help EA orgs anticipate what sort of projects the domain experts they'd depend upon would be likely to spring for. I imagine that some EA orgs must periodically come up with, say, ideas that would require significant scientific input. Some of those ideas might be more attractive to the scientists than others. If an org like this existed, it might be able to tell those EA orgs which ones the scientists are likely to go for.
That does seem like the kind of job that could productively exist at the intersection of EA orgs. They'd need to understand EA concepts and the relationships between institutions well enough to speak "on behalf of the movement," while gaining a similar understanding of domains like the scientific, political, business, philanthropic, or military establishment of particular countries.
An EA diplomat.
I agree that there are fewer low-hanging fruit than there used to be. On the other hand, there's more guidance on what to do and more support for how to do it (perhaps "better maps to the trees" and "better ladders" - I think I'm plagiarising someone else on the ladder bit). I'd guess that it is now overall somewhat or significantly harder for someone in the position Ben Todd was in to make something as useful as 80k, but it doesn't seem totally clear.
And in any case, something as useful as 80k is a high bar! I think something could be much less useful and still be very useful. And someone perhaps could "skill up" more than Ben Todd had, but only for like a couple years.
And I think there really are still a lot of fairly low-hanging fruit. I think some evidence for this is the continuing number of EA projects that fill niches that obviously should be filled, seem to be providing value, and are created by pretty early-career people. (I can expand on this if you want, but I think e.g. looking at lists of EA orgs already gives a sense of what I mean.)
I agree with many parts of your comment, but I continue to think only some sizeable fraction of people should be advised to "Go make yourself big and strong somewhere else, then come back here and show us what you can do", while also:
many people should try both approaches at first
many people should focus mostly on the explicitly EA paths (usually after trying both approaches and getting some evidence about comparative advantage)
many people should go make themselves big and strong and impactful somewhere else, and then just stay there, doing great stuff
I think it's perhaps a little irresponsible to give public advice that's narrower than that - narrower advice makes sense if you're talking to a specific person and you have evidence about which of those categories of people they're part of, but not for a public audience.
(I think it's also fine to give public advice like "on the margin, somewhat more people should be doing X, and some ways to tell if you specifically should be doing X are Y and Z". I think 80k's advice tends to look like that. Though even that often gets boiled down by other people to "quick, everyone should do X!", and then creates problems.)
I do think it would be good to get more clarity on what proportion of EAs are spending too much vs too little time pursuing explicitly EA-aligned roles (given their ultimate goals, fit, etc.), and more clarity on what proxies can be used to help people work out which group they're in.
Though I think some decent insights can already be gleaned from, for example, posts tagged Working at EA vs Non-EA Orgs or things linked to from those posts, and one-on-one career advice conversations.
(And I think we can also improve on the current situation - where some people are writing themselves off and others have too narrow a focus - by just making sure we always clearly acknowledge individual variation, there being many different good paths, it taking time to work out what one is a fit for, etc.)
(I also like that Slate Star Codex post you link to, and agree that it's relevant here.)