I agree that Earning to Give may make sense if you’re a neartermist or don’t share the full moral framework. This is why my next sentence begins “if you’d be donating to longtermist/x-risk causes.” I could have emphasized these caveats more.
I will say that if a path is not producing value, I very much want to demotivate people from pursuing that path. They should do something else! One should only be motivated for things that deserve motivation.
I’ve looked at the posts you shared and I don’t find them compelling.
I think the best previous argument for Earning to Give is that you as a small donor might be able to fund things that the major funders won’t or can’t, but my current sense is that the bar is now sufficiently low that it is very hard to find such opportunities (within the x-risk/longtermist space and framework, at least). Things that seem even remotely like a good idea get funding now.
I think that the reason we’re not hiring more people isn’t for lack of money, as discussed on that post.
There might be crazy future scenarios where EA suddenly needs a tremendous amount of money, more than all the funders currently have (or will have), in which case additional funds might be useful. But if we really thought this was the case, it seems the big funders should raise the bar and not fund as many things as generously as they do.
Thank you for the detailed reply!

> I agree that Earning to Give may make sense if you’re a neartermist or don’t share the full moral framework. This is why my next sentence begins “if you’d be donating to longtermist/x-risk causes.” I could have emphasized these caveats more.
OK, I guess it would have been better to phrase it a little differently … make it more like ‘my belief is, and the consensus of people I’ve spoken with … in the context of longtermist and x-risk causes’.
> I will say that if a path is not producing value, I very much want to demotivate people from pursuing that path. They should do something else! One should only be motivated for things that deserve motivation.
I agree with this, which is why I also said something like ‘and I think ETG actually has great value’.
> I’ve looked at the posts you shared and I don’t find them compelling.
What about the LW post? That seems like the most compelling one to me: it argues that ‘actually you probably could use more money to hire better people into AI research etc., it just isn’t being done right’.
My basic skepticism is a classical economics argument: unless intrinsic motivation is both rare and extremely important, money should be convertible into talent. If ‘problem X needs more talent’, you should be able to hire people to work on problem X, subsidize training for people to build skills to address X, fund prizes for solutions to X, etc. If the issue is ‘the problems are not defined well enough’, you should also be able to fund people to target these problems, and maybe fund people to refocus their research on them.
My fear is that the ‘ETG is not important’ claim is coming from a sort of drop-in-the-ocean fallacy (“there’s already $1 billion going into X, so my $10,000 can’t make a difference”).
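To make the fallacy concrete, here is a toy formalization (the symbols are mine, purely for illustration): what an extra donation buys depends on the cost-effectiveness of the marginal grant, not on the size of the existing pool. If the best unfunded opportunity produces impact at a cost of $c$ per unit, then an extra donation $d$ buys

\[
\text{impact}(d) = \frac{d}{c},
\]

and this does not shrink as the total funding $T$ already committed grows. The $10,000 is only wasted if $c$ has genuinely become enormous, which is the substantive question; the ratio $d/T$ by itself tells us nothing.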
I also think that some of the critiques along the lines of “we don’t know what to do next in x-risk/s-risk that isn’t being funded” probably also apply to direct work. If we don’t know what to do or fund, then how do we know that an additional EA skilling up and focusing on this stuff will have a major impact?