Interesting. Do you have any thoughts on what status within the community is currently aligned with? My recent thought was that we make a mistake by over-emphasizing impact (or success) when assigning social status, rather than, for instance, "trying your best on a high-EV project regardless of outcome".
I'm going to replace "impact" with "expected impact" in my post, since expected impact is really what I was thinking about. I agree that tangible outcomes are given more status than taking low-probability, high-impact bets.
The four main other ways I can think of in which social status in EA isn't perfectly aligned with expected impact are:
Working for / with EA orgs gets more social status than working outside EA orgs (I think this is the most significant misalignment)
Longtermist stuff gets more social status than neartermist stuff
Research roles get more social status than ops roles (except for the ops roles right at the top of organisations)
Philosophy gets more social status than technical research
I want to agree with you, but I feel like whenever I come up with an example of someone who is high prestige and fits at least three of your four criteria, I can think of someone of roughly equal prestige who fulfils maybe one or none of them. I've been wondering how to study or test these claims about prestige in the community in a less subjective way (although I don't know how important it would be to actually do this).
Yeah, I don't think it's important to pin down exactly how social status is misaligned with expected impact. We should assume this kind of misalignment will exist by default because people are irrational, and as long as we recognise this, we can mitigate the harmful effects by trying to avoid optimising for social status.