I share your worries about the effects on culture. At the same time, I don't see this vision as bad:
For many months, they will sit down many days a week and ask themselves the question "how can I write this grant proposal in a way that person X will approve of" or "how can I impress these people at organization Y so that I can get a job there?", and they will write long Google Docs to their colleagues about their models and theories of you, and spend dozens of hours thinking specifically about how to get you to do what they want, while drawing up flowcharts that will include your name, your preferences, and your interests.
Imagine a global health charity that wants to get on the GiveWell Top Charities list. Wouldn't we want it to spend a lot of time thinking about how to get there, ultimately changing the way it works in order to produce the evidence needed to be included? For example, Helen Keller International was founded more than 100 years ago, and its vitamin A supplementation program is recommended by GiveWell. I would love to see more external organisations change in order to get EA grants, instead of us reinventing the wheel where others might already be doing good work.
Organisations getting started or changing based on the available funding of the EA community seems like a win to me. As long as they have a mission that is aligned with what EA funders want and they are internally mission-aligned, we should be fine. I don't know enough about Anthropic, for example, but they just raised $580M mainly from EAs while not intending to make a profit. This could be a good signal to more organisations out there to set up a model that makes them attractive to EA funders.
In the end, it comes down to the research and decision-making of the grantmaker. GiveWell has a process where they evaluate charities based on effectiveness. In the longtermism and meta space, we often don't have such evidence, so we may sometimes rely more on the value alignment of people. Ideally, we would want to reduce this dependence and see more ways to independently evaluate grants regardless of the people getting them.
I was going to write an elaborate rebuttal of the parent comment.
In that rebuttal, I was going to say there's a striking lack of confidence. The concerns seem like a pretty broad argument against building any business or non-profit organization with a virtuous culture. There are many counterexamples to this argument, and most of them carry the additional burden of balancing that growth while tackling existential issues like funding.
It's also curious that corruption and unwieldy growth had to set in exactly now, rather than, say, with the $8B in 2019.
I don't know enough about Anthropic for example but they just raised $580M mainly from EAs while not intending to make a profit. This could be a good signal to more organisations out there trying to set up a model where they are interesting to EA funders.
Now I sort of see how, combined with several other factors, maintaining culture and dealing with adverse selection ("lemons") might be an issue.