I think point one is both really important and really risky for EA to pursue. A lot of the current discussion around EA’s reputation has been really defensive/reactive. Part of that is because we’re trying to put out controversy fires, but I also think that EAs are much more comfortable not interacting at all with illegible, traditional systems and undervalue what playing the game well could get us.
As a dramatic example, the UN spent $47 billion in 2021 on a hilariously imbalanced set of 17 development goals. These goals are revised every 15 years. We can try to grab a seat at the table in 2030 or 2045 and align the new goals with EA principles; at current funding levels, that could mean billions of dollars per year diverted to EA causes. (And current funding levels are a very, very conservative assumption, btw: funding appears to be increasing steadily YoY.)
But on the flipside, would an organization that can get a seat at the UN table still be recognizably EA? Or will we have destroyed the heart of it to get there?
EA has some essential weirdnesses that will mean that it’ll always be a black sheep in the nonprofit industrial complex, and I really don’t want to see us lose those weird things for the sake of increasing reputation/funding. So it’ll be a delicate balancing act.
I personally think that the ideal (but very difficult) way forward is to try to be (and also seem[1]) so staunchly ethical that it warps public opinion away from the rest of the nonprofit ecosystem and towards us, instead of trying to become part of the trad nonprofit in-group. I think the anti-slavery activist and Quaker Benjamin Lay is an inspirational figure for this path.
To be explicit, this means stopping, or setting much higher bars for, doing things that are effective and would be the rational choice in a social vacuum, but that burn social capital: buying castles, having really nice office spaces, paying above-market rates for employees at EA charities, overly enthusiastic and conspicuously well-funded university groups, etc.
My sense is that a lot of EAs think that trading off reputation to do these effective things is worth it, but my claim is that they think so because they don’t realize how large the upside of a good reputation is. I certainly didn’t think of positive reputation as having any value before I started working at Samaritans; my model was that reputation is something you strive to keep non-negative, and then you’re basically good.
I also want to state explicitly that even with the correct model of the upside of a stellar reputation, it could still be better in the long run to continue doing things that are slightly off-putting. “Reputation is priceless” is not literally true; it could be more effort than it’s worth for EA to pursue, considering that we have a pretty deep stock of, like, Google software engineers who feel very alienated from the trad nonprofit world, etc.