Counterpoint: my casual impression is that status-within-EA is actually quite strongly positively correlated with real-world-impact. The people who are publishing influential books, going on podcasts, publishing research, influencing policy, and changing public consciousness tend to get high status within EA. I can’t really think of any EAs who are doing high-impact outreach who don’t get commensurate status within EA.
So, I think the EA community is doing a pretty good job of solving the ‘status alignment problem’: aligning status-within-EA with real-world-impact.
But I guess one could make a distinction between ‘real-world-impact’ at the level of changing people’s minds in the direction of EA insights and values, versus ‘real-world-impact’ at the level of reducing actual sentient suffering and promoting well-being. The latter might be quite a bit harder to quantify.