Are we not double-counting “good done” if we follow the guidelines in that post? I.e. if we attribute the full “good done” credit to the recruiter, we implicitly can’t also attribute it to the EA convert. But usually we also “credit” the person doing the EtG (or other EA activity) with the full amount of good they do.
Like, it would seem strange if all the EA good I did for the rest of my life was credited to the people who helped recruit me. They are awesome and deserve lots of praise and credit as well, but perhaps not all or most of the credit for the EA hours I work.
If most people who will become EAs are basically EAs-in-waiting (they just need to hear the magic EA words or whatever), then the recruiter is probably partly responsible for how much faster the conversion happened, but not for the lifetime good done. If he or she got a person to learn about EA a year earlier, that’s maybe 1-2 lives saved (if the convert makes something like $33-66k per year).
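For what it’s worth, the rough arithmetic behind that “1-2 lives” figure works out if you assume something like a 10% donation rate and a cost per life saved in the low thousands of dollars. Both of those numbers are my own assumptions for illustration, not anything stated in the thread:

```python
# Back-of-the-envelope sketch of the "one year earlier" arithmetic.
# All numbers are illustrative assumptions, not figures from the thread.

donation_fraction = 0.10   # assumed share of salary donated
cost_per_life = 3_500      # assumed dollars to save one life (rough ballpark)

for salary in (33_000, 66_000):
    donated = salary * donation_fraction
    lives = donated / cost_per_life
    print(f"${salary:,}/yr -> ${donated:,.0f} donated -> ~{lives:.1f} lives per extra year")

# Prints roughly 0.9 and 1.9 lives, i.e. the "maybe 1-2 lives saved" above.
```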
It’s possible for two parties to be causally necessary for something to happen, and in that case, both are 100% responsible for causing the event.
E.g. 1: I tell the hitman where the target is; the hitman shoots the target. The target wouldn’t have died if either of us had failed, so we both 100% caused the person to die.
E.g. 2: I think it’s true that if Will and Toby had never existed, I would never have done any EA stuff, so firstly they do get the “credit” in some sense, and secondly both may have been causally necessary.
But then also note:
1) “causes” doesn’t equal “gets the credit for” in the usual sense of “credit”.
2) In real-life situations, it’s usually true that if one person didn’t persuade X of something, then someone else would have persuaded them later. Once you properly think through the counterfactuals and uncertainties, it’s rarely true that one person 100% caused anything. So if you “recruit” an EA, you probably don’t cause all of their impact. It’s probably better to model it as a speed-up, like you suggest (rough sketch below).
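One way to make that speed-up framing concrete (this is my own sketch, not something anyone in the thread proposes): weight the full-lifetime-impact case by the chance the person would never have found EA without you, and otherwise credit only the expected years of acceleration.

```python
# Hypothetical speed-up model of a recruiter's counterfactual impact.
# Every parameter here is an assumption for illustration.

def recruiter_counterfactual_impact(
    annual_impact: float,        # good done per year once converted (e.g. lives saved)
    remaining_years: float,      # years of EA activity ahead of the convert
    p_never_otherwise: float,    # chance they would never have become an EA without you
    expected_delay_years: float, # if they'd have converted anyway, how much later
) -> float:
    full_credit = annual_impact * remaining_years
    speed_up_credit = annual_impact * expected_delay_years
    return p_never_otherwise * full_credit + (1 - p_never_otherwise) * speed_up_credit

# e.g. 1.5 lives/yr, 40 years ahead, 10% chance they'd never have converted,
# otherwise they'd have found EA ~1 year later anyway:
print(recruiter_counterfactual_impact(1.5, 40, 0.10, 1.0))  # -> 7.35, far below the "full" 60
```

Even with a fairly generous 10% chance that the person would never have converted otherwise, the recruiter ends up with a small fraction of the convert’s lifetime impact, which matches the intuition above.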
The hitman is replaceable, though: if this one had refused, another would probably have been hired, so counterfactually he didn’t 100% cause the death after all.
I’m not sure assigning credit is a good way to think about this. Instead, there are just decisions you can make, and you want to make the ones that most improve the world?
But Claire’s talking about moral accounting, which is a way of thinking about those decisions. In any one project there’s usually more than one action or person that could have done something else and let it fall apart. So you can’t say “it’s worth x for me to stay involved, because otherwise it wouldn’t have happened”, have all the other people reason the same way, and still be consistent in your approach across opportunities: maybe you could each have done x/2 separately, but thought that keeping the project going was more important. To get past this perfectly you can factor in all the counterfactuals, but that’s often easier said than done!
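To make the consistency problem concrete, here’s a toy version with made-up numbers: a project worth 10 needs both of two people, and each could have produced 5 (the x/2 above) working separately instead.

```python
# Toy illustration of the accounting problem: two people are each counterfactually
# necessary for a project worth `project_value`, and each could have produced
# `solo_value` (the x/2 in the comment above) working separately instead.
# All numbers are made up for illustration.

project_value = 10
solo_value = 5

# Each person, holding the other fixed, reasons: "without me the project dies,
# so staying is worth project_value - solo_value to the world."
per_person_counterfactual = project_value - solo_value   # 5 each: staying looks good

# But if both simply claim the full project as "their" impact, credit is double-counted:
naive_total_credit = 2 * project_value                   # 20, for a project worth 10

# And evaluated jointly, both staying vs. both doing their own thing is a wash:
joint_counterfactual = project_value - 2 * solo_value    # 0

print(per_person_counterfactual, naive_total_credit, joint_counterfactual)
```

Each person’s own counterfactual says staying is worth +5, yet both claiming the full 10 double-counts, and the joint counterfactual of both staying versus both leaving is 0. Factoring in everyone’s counterfactuals at once resolves this, but, as the comment says, that’s easier said than done.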