I will tentatively say that Earning to Give is dead
ETG is not dead, and you shouldn’t declare it dead. Besides being false, the claim is, as I argue below, demotivating for the 15-20% of surveyed EAs pursuing this path.
ETG has a strong claim to being the most good you can do for the world.
It is clearly highly valuable if you are not fully longtermist (perhaps because you do not have total population ethics).
Even if you are longtermist, there is still a strong claim for the value of ETG.
It has been de-prioritized by 80k, but they have a specific argument for this that depends on a set of strong claims and a particular (longtermist) moral framework. Among other things...
From their wiki: “To a significant extent, this follows from the fact that the most promising causes tend to be talent-constrained rather than funding-constrained.”
But there is a strong case that talent can be bought with funding, and more could be done here.
See recent posts:
https://forum.effectivealtruism.org/posts/cjH2puDzAFrtrrThQ/despite-billions-of-extra-funding-small-donors-can-still
https://forum.effectivealtruism.org/posts/K88PSysTyHCiYdXwM/earning-to-give-may-be-the-best-option-for-patient-eas
https://forum.effectivealtruism.org/posts/djK3d6yp7agtCTMGq/is-earning-to-give-more-advantagious
Thank you for the detailed reply!
I agree that Earning to Give may make sense if you’re neartermist or don’t share the full moral framework. This is why my next sentence begins “if you’d be donating to longtermist/x-risk causes.” I could have emphasized these caveats more.
I will say that if a path is not producing value, I very much want to demotivate people pursuing that path. They should do something else! One should only be motivated for things that deserve motivation.
I’ve looked at the posts you shared and I don’t find them compelling.
I think the best previous argument for Earning to Give is that you as a small donor might be able to fund things that the major funders won’t or can’t, but my current sense is that the bar is sufficiently low that it is now very hard to find such opportunities (within the x-risk/longtermist space and framework, at least). Things that seem like even remotely a good idea get funding now.
I think that the reason we’re not hiring more people isn’t for lack of money, as discussed on that post.
There might be crazy future scenarios where EA suddenly needs a tremendous amount of money, more than all the funders currently have (or will have), in which case additional funds might be useful, but...it seems if we really thought this was the case, the big funders should raise the bar and not fund as many things as generously as they do.
“I agree that Earning to Give may make sense if you’re neartermist or don’t share the full moral framework. This is why my next sentence begins ‘if you’d be donating to longtermist/x-risk causes.’”

OK. I guess it would be better to have phrased it a little differently … make it more like ‘my belief is, and the consensus of people I’ve spoken with … in the context of longtermist and x-risk causes’.
“I will say that if a path is not producing value, I very much want to demotivate people pursuing that path.”

I agree with this, which is why I also said something like ‘and I think ETG actually has great value’.
“I’ve looked at the posts you shared and I don’t find them compelling.”

What about the LW post? That seems like the most compelling one to me: the argument that ‘actually you probably could use more money to hire better people into AI research etc.; it just isn’t being done right’.
My basic skepticism is sort of a classical economics argument. Unless intrinsic motivation is both rare and extremely important, money should be convertible into talent: if ‘problem X needs more talent’, you should be able to hire people to work on problem X, subsidize training people to build skills to address X, fund prizes for solutions to X, etc.
If the issue is ‘the problems are not defined well enough’, you should also be able to fund people to define and scope these problems, and perhaps fund researchers to refocus their work on them.
My fear is that the ‘ETG is not important’ claim comes from a sort of drop-in-the-ocean fallacy (“there’s already $1 billion going into X, so my $10,000 can’t make a difference”).
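To see why that pattern of reasoning fails, here is a minimal sketch with made-up numbers (the pool size, donation size, and cost-per-unit figure are all hypothetical, not real estimates): so long as marginal returns are roughly constant, the impact of a small donation depends on the cost of doing good at the margin, not on the size of the existing funding pool.

```python
# Hypothetical numbers, purely for illustration.
total_funding = 1_000_000_000      # money already flowing into cause X
my_donation = 10_000               # a small ETG donor's annual gift
cost_per_unit_of_good = 5_000      # assumed roughly constant marginal cost

# Under roughly constant marginal returns, impact scales with dollars given,
# regardless of how big the existing pool is:
my_impact = my_donation / cost_per_unit_of_good     # 2.0 units of good
my_share_of_pool = my_donation / total_funding      # 0.00001, i.e. 0.001%

# The fallacy judges the donation by my_share_of_pool (vanishingly small)
# rather than by my_impact (which the size of the pool does not change).
print(my_impact, my_share_of_pool)
```

The sketch only holds where returns really are near-linear at the margin; the substantive disagreement above is precisely about whether they still are in the longtermist space.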
I also think that some of the critiques along the lines of “we don’t know what to do next in x-risk/s-risk that isn’t already being funded” probably apply to direct work as well. If we don’t know what to do or fund, then how do we know that an additional EA skilling up in, or focusing on, this stuff will have a major impact?