This is because I don’t think we can evaluate which replacement candidate would have filled the role if the person who was hired had instead done e2g.
Idk I feel like you can get a decent sense of this from running hiring rounds with lots of work tests. I think many talented EAs are looking for EA jobs, but often it’s a question of “fit” over just raw competence.
> My understanding is that many non-EA jobs provide useful knowledge and skills that are underrepresented in current EA organizations, albeit my impression is that this is improving as EA organizations professionalize
This seems plausible, though I personally think it’s somewhat overstated on the forum. I agree that more EAs should be “skill maxing” over direct work or e2g, but I don’t think we should use e2g as a shorthand for optimising for developing valuable skills in the short term.
> I think many talented EAs are looking for EA jobs, but often it’s a question of “fit” over just raw competence.
For the significant majority of EAs, does there exist an “EA job” that is a sufficiently good fit as to be superior to the individual’s EtG alternative? To count, the job needs to be practically obtainable (e.g., the job is funded, the would-be worker can get it, the would-be worker does not have personal characteristics or situations that prevent them from accepting the job or doing it well).
I would find it at least mildly surprising for the closeness of fit between the personal characteristics of the EA population and the jobs available to be that tight.[1]
For most social movements, funding only allows a small percentage of the potentially interested population to secure employment in the movement (such as clergy or other religious workers in a religious movement), so most members never face this sort of question. But I’d be skeptical that (e.g.) 85% of pretty religious people are well-suited to work as clergy or in other religious occupations.
I don’t understand why this is relevant to the question of whether there are enough people doing e2g. Clearly there are many useful direct-impact or skill-building jobs that aren’t at EA orgs, e.g., working as a congressional staffer.
I wouldn’t find it surprising at all if most EAs are a good fit for good non-e2g roles. In fact, earning a lot of money is quite hard; I expect most people won’t be a very good fit for it.
I think we’re talking past each other when we say “EA job”. If you mean a job at an EA org, I’d agree there aren’t enough roles for everyone, but most useful direct work/skill-building roles aren’t at EA orgs, so that doesn’t seem very relevant. If you mean a directly impactful job, or one useful for skill building, your claim seems wrong: there seem to be many jobs that will be better fits for people than e2g-motivated ones (imo).
I agree that we shouldn’t use e2g as a shorthand for skillmaxing.
I am less optimistic about the ‘fit’ vs. raw competence point. It’s not clear to me that fit for a position can easily be gleaned from work tests: a very competent person may be able to acquire that ‘fit’ within a few weeks on the job, for example, once they have more context for the kind of work the organization wants. So even if the candidates looked very different at the point of hiring, the comparison may come out differently if we imagine both in an actual job context, having learned things they did not know when they applied.
I am more broadly worried about ‘fit’ in EA hiring contexts, because, as opposed to markers of raw competence, ‘fit’ provides a lot of flexibility for selecting on traits that are relatively tangential to work performance and often unreliable. For example, value-fit might select for hiring likeminded folks who have read the same stuff the hiring manager has, reducing epistemic diversity. A fit for similar research interests also reduces epistemic diversity and locks in certain research agendas for a long time. A vibe-fit may select simply for friends and those who have internalized in-group norms. A work test on an explicitly EA project may select for those already more familiar with EA, even if an outsider candidate could easily pick up basic EA knowledge once they got the job.
My impression is that, overall, EA does have a noticeably suboptimal tendency to hire likeminded folks and folks in overlapping social circles (i.e., friends and friends of friends). Insofar as ‘fit’ makes it easier to justify this tendency internally and externally, I worry it will lead to suboptimal hiring. I acknowledge we may have very different kinds of ‘fit’ in mind here, but I do think the examples above exist in EA hiring decisions.
I haven’t run hiring rounds for EA orgs, so I may be completely wrong—maybe your experience has been that after a few work tests it becomes abundantly clear who the right candidate is.