This is a beautiful piece Abraham, thanks for writing it. I feel very similarly to you. I thought this EA Forum post from the FTX-era hit a few of these concerns well, as well as this comment from Benjamin Todd:
One way to see the problem is that in the past we used frugality as a hard-to-fake signal of altruism, but that signal no longer works.
Basically, it feels harder to know who is genuine and trustworthy versus who is involved for the status-based and financial incentives. While this isn't new, I feel like I've seen an increasing number of organisations and individuals who are functionally cosplaying an interest in EA to increase their chances of getting funding. This makes me sad: I would love not to have to question people's motives like that, but it feels necessary sometimes.
Also, the demanding part of EA is something I really value too (in fact, I wrote a relatively controversial post on some issues with paying high salaries in EA orgs shortly before the FTX crash). On the frugality aspects of demandingness: I feel torn on how to navigate this. As I say in the post above, I worry about losing some ideological commitment (and the related impact-focused decision-making) by paying generous salaries and attracting new people. But at the same time, I am very happy that we can pay more as a movement if it means attracting great people. Similarly, even though people can often fairly justify spending significant chunks of money to increase their productivity, this kind of thinking still makes me uneasy sometimes (the most obvious example being a $2k coffee table).
Thanks James! I liked the old piece. I have no idea how to handle the pay questions: my default answer is something like "pay people reasonably well, such that they can save for retirement, have families, etc.", but that view collapses in many ways once you're competing with the market. The AI space feels this especially hard, since organisations there have to compete directly with labs for talent.
But yeah, I don't really know how to sit with all of this. Maybe it's just a set of feelings I don't want to leave unsaid. But I also worry that the things that have pushed the community to find really interesting, unusual opportunities have come from it being narrow, high-trust, and highly truth-seeking, which might change as it grows.