It’s unclear to me whether you are saying that the potentially huge number of new people in EA will try to take advantage of EA resources for personal gain or that WE, who are currently in EA for altruistic reasons, will do so. The former sounds likely to me, the latter doesn’t.
I might be missing crucial context here since I’m not familiar with the Thielosphere and all that, but overall I also don’t think a huge number of new, unaligned people will be the downfall of EA. As long as leadership, thought leaders, and grantmakers in EA stay aligned, it may be harder for them to determine whom to give that grant (or that stamp of approval), but wouldn’t that simply lead to fewer grants? Which seems bad, but not like the end?
Or are you imagining highly intelligent people with impressive resumes who strategically aim to hijack EA resources for their aims and get into important positions in EA?
I think cooperative equilibria are fragile. For example, as salaries have increased in EA, I’ve seen many people who previously took very low salaries come to feel much worse about that sacrifice, because their less-aligned colleagues are now paid a lot more than they are.
Similarly, I’ve seen many people who really cared about honesty end up in environments where honesty was less valued, and then quickly adopt less honest norms themselves.
I think EA leadership has a lot of people with strong moral character, but assuming that all of these people will completely ignore the incentives around them is overoptimistic. I think we are very likely going to see a shift in the norms among people who have historically acted very selflessly (I don’t feel super happy about using the word “selflessly” here, but explaining why would take a long time, so I’ll leave this parenthetical as a bookmark).
Separately, I also think that yes, we are going to get highly intelligent people with impressive resumes who will strategically aim to hijack EA resources for their own aims and get into important positions in EA. Studying almost any other social movement, religion, or large social organization will reveal many such people, and it strikes me as quite unlikely that we will not come under similar pressures.
Ah, the point about fragile cooperative equilibria makes sense to me.
I’m not as sure as you that this shift would happen to core EA though. I could also imagine that current EAs will have a very allergic reaction to new, unaligned people coming in and trying to take advantage of EA resources. I imagine something like a counterculture forming where aligned EAs start purposefully setting themselves apart from people who’re only in it for a piece of the pie, by putting even more emphasis on high EA alignment. I believe I’ve already seen small versions of this happening in response to non-altruistic incentives appearing in EA.
The faster the flood of new people and change of incentives happens, the more confident I am in this view. Overall, though, I’m not very confident in it at all.
On your last point: if I understand it right, this is not the thing you’re most worried about though? Like, these people hijacking EA are not the mechanism by which EA may collapse, in your view?