Just broaden your conception of the team to the whole EA community, and stop worrying about how much of the “credit” is yours.
To me, this is the crux. If you can flip that switch, problem (practically) solved—you can take on huge amounts of personal risk, safe in the knowledge that the community as a whole is diversified.
Easier said than done, though: by and large, humans aren’t wired that way. If there’s a psychological hurdle tougher than the idea that you should give away everything you have, it’s the idea that you should give away everything you have for an uncertain payout.
What if you and your friend bring the same skills and effort to the team, each of you taking big bets on cause areas, but your friend’s bets pay out and yours don’t? All credit goes to your friend, and you feel like a failure. Of course you do!—because effort and skill and luck are all hopelessly tangled up; your friend will be (rightfully) seen as effortful and skilled, and no one will ever be able to tell how hard you tried.
What can make that possibility less daunting?
Notice when you’re thinking in terms of moral luck. Try to appreciate your teammates for their efforts, and appreciate them extra for taking risks.
Get close with your team. There’s a big difference, I expect, between knowing you’re a cog in a machine and feeling the machine operating around you. A religious person who goes to church every day is a cog in a visceral machine. An EA who works in a non-EA field and reads blogs to stay up to date on team strategy might feel like a cog in a remote, nebulous machine.
That’s all good, intuitive advice. I’d considered something like moral luck before but hadn’t heard the official term, so thanks for the link.
I imagine it could also help, psychologically, to donate somewhere safe if your work is particularly risky. That way you build a safety net. In the best case, your work saves the world; in the worst case, you’re earning to give and saving lives anyway, which is nothing to sneeze at.
My human capital may best position me to focus my work on one cause to the exclusion of others. But my money is equally deliverable to any of them. So it shouldn’t be inefficient to hedge bets in this way if the causes are equally good.