The flip side is that maybe EA social incentives, and incentives in general, should be structured to reward impact more than they currently do. I’m not sure how to do this well, but ideas include:
reward people for EA impressiveness more than social impressiveness
reward people for actual work done more than EA impressiveness
reward people based on ex ante altruistic payoff rather than ex post payoff
upweight (relative to our normal social instincts) the value of work that’s less naturally impressive (e.g. “boring” ops stuff)
reward research that is useful more than research that is intellectually interesting or impressive
be more scope-sensitive about impact in our social instincts (e.g., I think if I make one great grant, people will be impressed; if I make five great grants, people will be more impressed, but much less than 5x as impressed)
reward the virtue of silence
reward people who do great work but engage less with the community
conversely, socially reward people less if they already receive a lot of non-EA social rewards because their work is well regarded outside our community for non-EA reasons
Unfortunately I don’t have good ideas for how to implement this in practice.
I guess some of those things you could reward monetarily. Monetary rewards seem easier to steer than more nebulous social rewards (“let’s agree to celebrate this”), even though the latter should be used as well. (Also, what’s monetarily rewarded tends to rise in social esteem; particularly so if the monetary rewards are explicitly given for impact reasons.)
Yes, I like the idea of monetary rewards.
Some things might need a lot less agreed-upon celebration in EA, like DEI jobs and applicants and DEI-styled community management.
I agree with you that we should reward impact more, and I like your suggestions. I think that creating better incentives for searching out and praising/rewarding ‘doers’ is one model to consider. I can imagine a person at CEA being responsible for noticing people who are having underreported impact, offering them conditional grants (e.g., financial support to transition to more study or full-time work), and giving them recognition by posting about and praising their work on the Forum.
I would be quite curious to know how this could work!
You could spotlight people who do good EA work but are virtually invisible to other EAs and do nothing of their own volition to change that, i.e., people who are neither birds of paradise nor social butterflies.