What is it for EA to thrive?
EA Infrastructure Fund’s Plan to Focus on Principles-First EA includes a proposal:
‘The EA Infrastructure Fund will fund and support projects that build and empower the community of people trying to identify actions that do the greatest good from a scope-sensitive and impartial welfarist view.’
And a rationale (there’s more detail in the post):
‘[...] EA is doing something special.’
‘[...] fighting for EA right now could make it meaningfully more likely to thrive long term.’
‘[...] we could make EA much better than it currently is—particularly on the “beacon for thoughtful, sincere, and selfless” front. [...]’
Here I’m spending some time thinking about this, in particular:
What does it mean for EA to thrive?
What projects could push EA in the direction of thriving?
(I work at EAIF. These are my personal views; I’m not speaking on behalf of EAIF here.)
What’s going on with ‘EA Adjacents’?
There’s a thing where lots of people will say that they are EA Adjacent rather than EA (funny post related to this). In particular, it seems to me that the closer to the core people are, the less inclined they are to identify themselves with EA. What’s going on here? I don’t know, but it’s an interesting trailhead to me.
Plausibly there are some aspects of EA (the culture, norms, worldview, individuals, organisations, etc.) that people disagree with or don’t endorse, and so they prefer not to identify as EAs.
I’m unsure how much to treat this as reflective of a substantive issue vs. a quirk, or reflective of things being actually fine. At least in terms of EA being a ‘beacon for thoughtful, sincere, and selfless’, it seems a little worrying to me that some core members of the community aren’t willing to describe themselves as EAs.
Perhaps a way of getting to the heart of this is asking people something like: Imagine you’re talking to someone who is thoughtful, sincere and selfless. Would you recommend EA to them? Which parts? How strongly? Would you express any reservations?
Looping back to the question of ‘What is it for EA to thrive?’, one answer is: it’s the kind of community that EAs would strongly recommend to a thoughtful, sincere and selfless friend.
(Maybe this is too strong—people will probably reasonably disagree about which aspects of EA are good and which aren’t, and if everyone is very positive on EA in this way, that plausibly means there’s not enough disagreement in the community.)
I share this impression. Also, we see that satisfaction is lower among people who have been in EA longer compared to newer EAs (though this is not true for self-reported engagement), which seems potentially related. Note that we would expect to see pressure in the opposite direction due to less satisfied people dropping out over time.
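The survivorship point in that last sentence can be sketched with a toy simulation (all the numbers here are invented for illustration, not taken from any survey): even if every individual’s satisfaction declines with tenure, dropout of the least satisfied people raises the average among those who remain, so the gap we observe between new and veteran cohorts understates the true decline.

```python
import random

random.seed(0)

# Toy, made-up parameters: satisfaction starts ~N(7, 1), declines
# 0.3 points per year of involvement, and each year anyone below 5
# has a 50% chance of dropping out (and leaving the sample).
def surviving_cohort(years, n=10_000):
    satisfaction = [random.gauss(7, 1) for _ in range(n)]
    for _ in range(years):
        satisfaction = [s - 0.3 for s in satisfaction]
        satisfaction = [s for s in satisfaction
                        if s >= 5 or random.random() < 0.5]
    return satisfaction

mean = lambda xs: sum(xs) / len(xs)
gap = mean(surviving_cohort(0)) - mean(surviving_cohort(5))

# The true per-person decline over 5 years is 1.5 points, but
# dropout of the least satisfied shrinks the gap we observe.
print(f"observed gap: {gap:.2f} (true decline: 1.50)")
```

Under these invented parameters the observed cohort gap comes out below the true 1.5-point decline, which is the direction of bias the comment describes.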
I think this implies that there is a substantive non-quirky effect. That said, I imagine some of this may be explained by new EAs simply being particularly enthusiastic in ways which explain stronger identification with EA and higher satisfaction.[1]
One dynamic which I expect explains this is the narcissism of small differences: as people get closer to EA, differences and disagreements become more salient, and so they may become more inclined to distance themselves from EA as a whole.
I’m not suggesting any particular causal theory about the relationship between satisfaction and identification.
If being thoughtful, sincere and selfless is a core value, it seems like it would be more of a problem if influential people in the community felt they had to embrace the label even when they didn’t think it was valuable or accurate.
I suspect a lot of the ‘EA adjacent’ description comes from question marks about particular EA characteristics, stances or image, rather than from doubting that some of their friends could benefit from participating in the community; and that part of it is less a rejection of EA altogether and more an acknowledgement that they often find themselves at least as closely aligned with people doing great work outside the community.
(Fwiw, I technically fit into the “adjacent” bracket from the other side: I’ve never been significantly active in the community; I like some of its ideas and values—many of which I believed in before ‘EA’ was a thing—and I don’t identify with, or actively disagree with, other ideas commonly associated with EA, so it wouldn’t really make much sense to call myself an EA.)
Some EA psychological phenomena
Some things that people report in EA:
Impostor Syndrome
Impact obsession
Burnout
EA Disillusionment
Are these EA phenomena? Also, are they psychological phenomena?
These things (excluding, I guess, EA disillusionment) don’t just exist within EA; they exist within society in general, so it’s plausibly unfair to call them EA phenomena. Though it also seems to me that for each of these things there’s a somewhat strong fit with EA and EA culture.
Taking impostor syndrome as an example: EA often particularly values ambitious and talented people. Also, it seems to me there’s something of a culture of assessing and prioritising people on this basis. Insofar as it’s important for people to be successful within EA, it’s also important for people to be seen in a certain way by others (talented, ambitious etc.). In general, the stronger the pressure there is for people to be perceived in a certain way, the more prominent I expect impostor syndrome to be.
(I’m a bit wary of ‘just so’ stories here, but my best guess is that this is in fact explanatory.)
I think impostor syndrome and other things in this ballpark are often discussed as individual/psychological phenomena. I think such framings are pretty useful. And there’s another framing which sees them instead as ~sociological phenomena—things which happen in a social context, as a result of different social pressures and incentives within the environment.
I don’t know quite what to conclude here, in large part because I don’t know how common these things are within EA, or how this compares to other places (or even what the relevant comparison class is). Though tentatively, if I’m asking ‘What does it look like for EA to thrive?’, then part of my answer is ‘being an environment where impostor syndrome, burnout, impact obsession and EA disillusionment are less common’.
I would predict that EA levels of impostor syndrome and burnout are similar to those in other elite competitive fields (e.g. elite universities and professions), and I’d theorise that levels of ~competitiveness/pressure are pretty good predictors of levels of ~impostor syndrome/burnout in different fields. In fact, I’d guess they are probably the best predictors of differences in burnout across fields,[1] except for factors which select for people who are more predisposed towards impostor syndrome/anxiety in general.
Given that, I’d expect there to be some, but limited, capacity for EA to reduce its levels of impostor syndrome and burnout, based on the assumption that we have limited capacity to reduce the overall level of competitiveness in the field. For example, we might think that EA will inevitably be in the top right quadrant of this plot, able only to move up and down a little bit, either by slightly reducing competitiveness/pressure[2] or through changing the limited influences on impostor syndrome/burnout unrelated to competitiveness/pressure.[3]
There is only limited evidence that I’m aware of for this claim. For example, this small study finds a moderately strong correlation between classroom competitiveness and impostor syndrome. But their measures of perceived competitiveness are quite noisy, and include self-reported agreement with statements like “The professor seems to pit students against each other in a competitive manner in this class”, which seem of relatively little relevance.
I’m thinking, for example, of increasing the number of secure, desirable EA jobs. But I’m also speculating that a lot of the competitiveness/pressure within EA is endogenous, i.e. many people will just compete for better jobs/more status, and so this would have a relatively modest effect. I don’t have the impression that EA has a particularly competitive culture in the sense of a distinctive culture driving competition, as opposed to high levels of competition arising from EAs largely being very elite, success-oriented, go-getting people.
For example, I think there are probably some good ways to improve workplace support for these maladies. But I suspect the interventions aren’t very powerful (I’m not aware of any well-validated powerful interventions). And I would expect that many of the more effective possible interventions are in tension with high levels of competitiveness/pressure in the field, e.g. letting people take long sabbaticals, or taking a relaxed stance towards people meeting deadlines, may be very helpful but hard to implement or get people to take up in a competitive environment.
Conversely, my anecdotal impression is that distinctively EA phenomena like “impact obsession” are relatively rare (fewer than 5% of EAs I know seem to exhibit this to any degree), and my impression is that it may have been more common in the earlier years of EA. My personal impression of EA culture is that it does not promote impact obsession very significantly, though that is just personal experience, and my guess is that it is probably concentrated in particular pockets or networks within the community. Given that supposed lack of pressure from EA culture, it seems more driven by individual differences (some people are dramatically more concerned about doing the absolute most good they can, and more emotionally affected by it, than others) than by social factors.
Thanks! (I don’t have an immediate response to this, and found a bunch of the points you’re raising here pretty interesting)
What is the EA project? Also, who is it for?
There’s a comment by Saulius on an old EA Forum post: ‘[...] I see EA as something that is mostly useful when you are deciding how you want to do good. After you figured it out, there is little reason to continue engaging with it [...]’.
I found this pretty interesting, and it inspired a bunch of thoughts. Here’s a couple of oversimplified models of what it is to pursue the EA project:
Model 1:
EA involves figuring out how to do the most good and doing it.
Figuring out how to do the most good involves working out what cause area is most important and what career path to pursue.
Doing it involves getting such a job and executing well on it.
Model 2:
EA involves figuring out how to do the most good and doing it.
This is less of a two-step process and more of an ongoing cultivation of virtues and attitudes like scout mindset, effectiveness and so on. It involves constant vigilance, or constant attunement. It’s an ongoing process of development: personal, epistemic, moral, etc.
I see most EA community building efforts as mostly framed by model 1—for example EA groups, the EA Forum (and perhaps EAG/EAGx as well, though this is less clear). It seems to me a common pattern for people to engage with these things heavily when getting involved in EA, and then, once they’re ‘in’, to stop engaging with them and focus on executing at their job.
Insofar as engaging with these things (EA groups, EA Forum etc.) is a key component of what it is to engage with EA, I’m inclined to agree with the above comment—once you’ve figured out what to do there’s little reason to continue engaging with EA.
I’d like to see more community building efforts, or EA infrastructure, framed around model 2 - things that give EAs who are already ‘in’ a reason to continue engaging with EA, things that provide them with value in pursuing this project.
I don’t think models 1 and 2 necessarily have to come into conflict. Or at least, I think it’s fine and good for there to be people who see EA as mostly being relevant to a career decision process. And for people who want to treat the EA project as more like model 2 (an ongoing process of cultivating virtues like scout mindset, effectiveness and so on), I’d be excited to see more community building or infrastructure designed to support them in these aims.
I like this framing and agree that most CB effort seems to go into model 1, which I also spend most of my time working on. Model 2 efforts could help people choose career paths where they upskill or do direct work in organisations that are not EA-aligned. This could reduce the frustration connected with job searches of early career individuals.
EA Jobs, Scarcity and Performance
It seems like:
For many people, having an EA job is pretty important.
The EA job market is pretty competitive, and many people who want EA jobs will not in fact get them.
There’s been some discussion related to this on the EA Forum, focusing in particular on jobseekers. I’m also interested in exploring this dynamic with people who are working in EA jobs.
I expect EA job scarcity to have an impact not only on EA jobseekers, but also on people who are working in EA jobs.
Given 1 and 2, it seems pretty important for people working in EA jobs to keep them. If the job market is competitive, it may not be obvious that they can get another one. (For people who have got one EA job, it will presumably be easier to get another, but it’s not guaranteed.)
For someone who’s in a position of scarcity about their EA job, I can imagine this meaning they focus primarily on performing well/ being seen to perform well.
This becomes a problem if what counts as performing well and what is actually good to do come into conflict. E.g. this might involve things like:
Agreeing with the organisational strategy or one’s manager more than one endorses
Focusing on ensuring that they have achieved certain outputs independent of whether that output seems good
In general I expect that under conditions of scarcity people will be less able to do valuable work (and I mean valuable here as ‘actually good’, as opposed to ‘work that is perceived to be valuable’).
(If I’m right about this, then one potential answer to ‘what is it for EA to thrive’, is: EAs aren’t in a position of scarcity).
Things I’d be interested to ask people who are working at EA jobs to understand whether this is in fact a thing:
How concerned are you about your perceived performance?
If your employer/ manager/ funder/ relevant people said something like: ‘We have full confidence in you, your job is guaranteed and we want you to focus on whatever you think is best’ - would that change what you focus on? How much?
My personal impression is that significant increases in unrestricted funding (even if it were a 1-1 replacement for restricted funding) would dramatically change orgs and individual prioritisations in many cases.
To the extent that one thinks that researchers are better placed to identify high value research questions (which, to be clear, one may not in many cases), this seems bad.
Incentives within EA
Here’s a story you could tell about academia. Academia is, in some sense, supposed to be about generating knowledge. But it ends up being ineffective at this because of something something incentives. E.g.:
Academic jobs are highly competitive
In order to get an academic job, it’s more important to have done things like original research than things like replications.
Things like replications are undersupplied, and the replication crisis happens.
What are the incentives within EA? How do they affect how well EA ends up ‘doing the most good’? I don’t have a full theory here, though I suspect there are ways in which incentives in EA can push against doing the most good. Professional EA group funding is one example:
Professional EA group organisers are often in a bit of a precarious position. Their job depends on their ability to get funding from organisations like CEA or EAIF.
One of the main ways that EA group organisers are assessed is on the basis of things like how well they produce highly engaged EAs, or career plan changes or other such things (I think this is broadly true, though I don’t have a great insight into how CEA assesses groups).
Professional EA group organisers are incentivised to produce these kinds of things. Some potential problems here:
It’s hard to assess what counts as a good (e.g.) career, which pushes in the direction of non-standard career options being discounted.
Often it may make sense for someone to focus on building career capital over working at an EA organisation, but these kinds of things are less obviously/legibly impactful…
I agree with the general gist, but my impression is that organisations that focus on career changes, and grantmakers, have high epistemic humility. When looking at meta organisations focusing on career change, most seem not to break the changes down into types in their quantitative analysis. This leads to a greater focus on case studies, where different aspects like prior achievements and unusual career paths can be explained. I assume there is some signalling going on between grantmakers and group organisers: a low-fidelity version might point to standard options, whereas thoughtful grantmakers showcasing a wider variety of pathways as potentially impactful can make a difference.