This comment is a generic, low-information poke at this excellent article:
One of the takeaways of this article is that there has been a dramatic expansion in EA funding, increasing the overhang of “money” (over “talent”).
I think this reasonably creates the impression that EA funding is now very abundant.
I’m interested in, and worried about, the unintended side effects of this impression:
For an analogy, imagine a statement that the EA movement needs more “skill in biology”. In response, conscientious, strong EAs update and change careers. However, what was actually needed was world-class leaders in biology, whose stellar careers depend on special initial conditions. Unfortunately, this means that the efforts made by even very strong EAs were wasted.
I think such misperceptions can occur unintentionally. This motivates this comment.
With this motivation, it might be useful to interrogate these statements to get at the “qualia”, or less tangible character, behind the impression the new funding gives.
I’m not sure how to do this interrogation well. I’ve written speculative and likely unfairly aggressive scenarios about side effects from this impression of funding:
It undermines Earning to Give efforts, which can provide very large and hidden value to the movement (development of deep operational skills and benevolent coordination among EAs in industry, government, and policy).
The concentration undermines nimbler funding of smaller, nascent organizations. To explain: Open Phil funding can be hard to get (this is offset by the thoughtful, generous creation of orgs such as EA Funds). These issues may grow with greater centralization, and the perception of ample money may also undermine the funding of small orgs.
The major new driver of funding is related to cryptocurrency. There are few industries as volatile or uncertain. While you specifically flag this, I’m worried it will be buried by the “top line” statement that EA funding has increased—if this turns out not to hold, choices and actions may already have been made that reduce access to funding.
This concern is slightly different and relates to alignment: there may be shifts in the EA movement as a result of new, very large funders. This concern is heightened by the unusually conscientious, modest nature of the EA community, which updates readily, and because new funders may focus on relatively few and less established cause areas. While we should expect some cultural change as a result of a major increase in funding, these factors may make the community unstable or reduce cohesion.
Normal concerns about diversity of funding.
My first sentence, about this comment being low information and maybe unfounded, was not rhetorical.
I am happy to be completely wrong in every way!
It’s an absolute good if EAs update in response to changing resources, and articles like this one are invaluable.
Does anyone else have any comments about this?
Agree it’s good to think about these things. Our past messaging wasn’t nuanced enough—I tried to correct for those issues in the main post, but there are probably going to be new messaging issues.
One quick comment is that I’m pretty worried about issues in the opposite direction e.g. that people aren’t being ambitious enough:
Most EA orgs are designed to use at most tens of millions of dollars per year of funding, but we should be trying to think of projects that could deploy $100m+ per year.
This doesn’t immediately strike me as a bad outcome, ex ante. It’s very hard to know (1) who will become a world-class researcher or (2) whether non-world-class people move the needle by influencing the direction of their field ever so slightly (maybe by increasing the incentives to work on an EA problem through citations, peer review of these papers, etc.). I am by no means world class, but I’ve written papers that (I hope) pave the way for better people to work on animal welfare in economics; I participate in and attend conferences on welfare economics; I signed a consensus statement on research methodology in population ethics; I try to be a supportive, encouraging colleague of welfare economists working on GPR topics; and so on. I also worked under a world-class researcher in grad school and now sometimes serve as a glorified assistant (i.e., coauthor) who helps him flesh out his ideas and get more of them onto paper. In your example, if the community ‘needs more people in biology’, I think the sort of scaffolding I try to provide is probably(?) still impactful. (Caveat: I’m almost certainly over-justifying my own impact, so take this with a grain of salt.)
If 80K were pushing people into undesirable careers with little earnings potential, this might be a legitimate problem. But I think most of the skills built in these hits-based careers are transferable and won’t leave you in a bad spot.
Hi Kevin!
I saw your excellent posts on being an economics professor and on cutting WiFi. Both were great, and it’s wonderful to hear your perspective and learn about your work!
Also, thanks for your comment. I think I get what you’re saying:
(It’s not clear why anyone should listen to my opinions about their life choices.) But yes, it seems perfectly valid to go into any discipline, and you can add huge value and generate impact in many paths of life.
Also, there’s a subthread here about elitism that is difficult to unpack, but it seems healthy to discuss “production functions”, skill and related worldviews explicitly at some point.
To be frank, by giving my narrative example, I was trying to touch on past messaging issues that actually happened.
These messaging issues are alluded to in this article, also by Benjamin Todd:
https://80000hours.org/2018/11/clarifying-talent-gaps/
Basically, the problem is as suggested in my example—in the past, the need for very specific skills or profiles was misinterpreted as a need for general talent. This did result in bad outcomes.
I chose to give my narrative instead of directly pointing to a past instance of the issue.
By doing this, I hoped to be more approachable to those less familiar with the history. It is also less confrontational while making the same point.
Thanks for writing back, Charles, and for the unnecessary compliments on my inaugural posts :) I only know the context of mis-messaging around skills at a high level, so it is hard for me to respond without knowing what ‘bad outcomes’ look like. I don’t doubt that something like this could happen, so I now see the point you were trying to make.
I was responding as someone who read your (intentionally not fleshed out) hypothetical and thought the appropriate response might actually be for someone well-suited for ‘biology’ to work on building those broad skills even with a low probability of achieving the original goal.